
Electron Commerce Res (2007) 7: 89–116
DOI 10.1007/s10660-007-9002-9

Privacy and e-commerce: a consumer-centric perspective

Rhys Smith · Jianhua Shao

Published online: 16 March 2007
© Springer Science+Business Media, LLC 2007

Abstract Privacy is an ancient concept, but its interpretation and application in the area of e-commerce are still new. It is increasingly widely accepted, however, that by giving precedence to consumer privacy bigger benefits can be reaped by all parties involved. There has been much investigation into the concept of privacy, legal frameworks for protecting this most impalpable of human values and, more recently, computing technologies that help preserve an individual's privacy in today's environment. In this paper we review the historical development of this fundamental concept, discussing how advancements both in society and in technology have challenged the right to privacy, and we survey the existing computing technologies that promote consumer privacy in e-commerce. Our study shows that historically the protection of privacy has been driven primarily both by our understanding of privacy and by the advancement of technology, analyses the limitations of privacy protections for current e-commerce applications, and identifies directions for the future development of successful privacy enhancing technologies.

Keywords E-commerce · Privacy · Privacy enhancing technology

1 Introduction

New technologies (which of course include the Internet) are able to offer individuals levels of privacy unprecedented in modern human history. However, new technologies are equally able to facilitate gross breaches of an individual's privacy at equally unprecedented levels. Unfortunately, the latter case seems to be the more

R. Smith (✉) · J. Shao
School of Computer Science, Cardiff University, Cardiff, CF24 3AA, UK
e-mail: [email protected]

J. Shao
e-mail: [email protected]


predominant. There exists the commonly cited possibility, penned by Peter Steiner in a cartoon in The New Yorker (July 5th 1993), that "On the Internet, nobody knows you're a dog." What is in fact closer to reality in the world of today is that not only can anyone know that you are a dog—they can also find out what breed you are, what your favourite food is, your entire history of inoculations, your current geographical location (accurate to within one metre), and which is your favourite brand of shoe to chew.

One ever-emerging new technology is the area of e-commerce: although the area has existed for several years, it is a technology that is undoubtedly yet to reach its full potential. If one examines the current state of e-commerce and compares this with the vision of the future of e-commerce presented several years ago, then current e-commerce levels seem rather lacklustre. One report by Forrester Research estimated that e-tailers lost US$15 billion in 2001 because of the privacy concerns of consumers [28], while a report by Jupiter Research estimates that online retail sales would be approximately 24 percent higher in 2006 if these concerns were addressed effectively [39]. Comparing Forrester Research's 1999 prediction of US$184 billion of US online retail sales in 2004 [27] to the actual figures for 2004—approximately US$69 billion¹—shows that there has been an extremely large deficit in predicted revenue. One of the major reasons for this gulf between predicted and actual levels, suggested by the research community at large and backed up by many studies, e.g. [21, 28, 31, 37–39, 81, 82], is simply one of trust: a vast proportion of individuals able to engage in e-commerce are not willing to do so (or do so only at reduced levels) simply because they do not trust e-commerce sites to be secure and respectful of their privacy. The most recent of these studies was a Gartner survey [31] in June 2005, which shows that this problem is not abating, and predicts that it will continue to inhibit e-commerce growth rates in the coming years. The obvious conclusion is that, to impel e-commerce to boom to the oft-predicted levels, either e-commerce companies need to increase the trust the general e-consumer places in them, or technologies need to be created that are designed to protect an individual's privacy and security—forcefully if necessary.

The first of these possibilities, that of establishing and maintaining more trusted relationships with customers, is not a trivial matter. Numerous studies investigating ways of encouraging trust have been carried out and a number of mechanisms have been proposed, ranging from altering the design of an e-commerce site to make it appear more professional and "friendly" through to facilitating user interaction via the use of a computer-generated human-like agent. Patton and Jøsang surveyed the current state of this area in [51]. However, trust is viewed as something that is difficult to build and easy to lose [49], and there are doubts as to whether these techniques are effective enough to subdue the fears of individuals to any large degree.

The other possibility, that of using technologies to protect individual privacy, neatly avoids these problems by actually attempting to allay some of the common fears of individuals, rather than using psychological approaches in a simple attempt to circumvent these fears. A wide range of such technologies has been proposed; they can generally be split into two main methods: those that attempt to preserve an

¹Source: US Census Bureau—http://www.census.gov/estats.


individual's privacy by enabling anonymous communication channels for interaction between customer and e-business; and those that attempt to minimise the amount of personal information given to an e-business during the interaction. Whichever technological method is used, this approach of actually allaying individuals' fears about privacy is probably the more productive of the two possible approaches in the attempt to expand e-commerce levels, as a real solution to the problem will be seen in a more positive light by the individuals involved.

In order to fully explore the idea that e-commerce would benefit from enhanced consumer privacy, some fundamental understanding of privacy is required: what exactly privacy is, how our understanding of it has changed throughout history, and how this changing understanding, coupled with technological development, has historically affected the protection of privacy. This understanding will then give us the necessary tools to discuss how privacy exists in the information age that we live in. Thus, in the first part of this paper we first review the meaning of "privacy" and discuss how technological advancements in recent times have challenged the right to privacy. We then focus our discussion upon the burgeoning area of e-commerce, where we attempt to apply the lessons learned about privacy to the current world of e-commerce from a consumer's point of view. Given this enhanced understanding of privacy, in the second part of the paper we survey, evaluate, and analyse the limitations of the existing technologies that promote consumer privacy in e-commerce. Finally, we draw some conclusions about the future of e-commerce and consumer privacy.

The rest of the paper is structured as follows. Section 2 explores the general idea of privacy, discussing its nature, history, meaning, and how it has been affected by technological advances. Section 3 then uses these ideas to discuss the problem of retaining privacy in e-commerce and the ways in which this problem can be solved. Section 4 reviews some of the currently existing technologies that attempt to preserve or enhance privacy in the technological domain and discusses the future of these technologies and areas where more research is needed. Finally, some conclusions are drawn in Sect. 5.

2 Privacy

"Every man should know that his conversations, his correspondence, and his personal life are private." Lyndon B. Johnson, President of the United States, 1963–69.

Privacy, as a concept, is highly interesting. Perhaps its most striking feature is the fact that nobody seems able to agree upon what it actually is. The "right to privacy" has inspired considerable debate in many fields of thinking, including law, philosophy, sociology, politics and, more recently, computer science. This debate is fascinating, complex, and at times rather surprising. Furthermore, how this right to privacy fares when applied to the world of e-commerce is an even more contentious issue.

In our attempt to explore the issues presented in the introduction, and to grasp how the competing worlds of individual privacy and e-commerce can meet without colliding head-on, we need first to attempt to understand the general concept of privacy. In this section we will first examine the meaning and nature of privacy, followed by a brief examination of some important points in the historical development of privacy.


2.1 The nature of privacy

At first glance, the idea of privacy seems fairly intuitive. When pressed to elucidate this idea of privacy in a clear-cut, all-encompassing definition and defence, however, people consistently flounder. Thus, debate about this most impalpable of human values has raged throughout recorded history. These debates became prominent in the philosophy/sociology literature in the latter half of the 20th century: Benn and Gaus' anthology [9], Schoeman's anthology [67], and Weintraub and Kumar's anthology [78] have collected many of the important arguments presented throughout these decades.

The most rudimentary area of this debate is one which is concerned with the defence of privacy: why is privacy important, and what does retaining privacy bring to an individual? Many theorists have produced defences of privacy that state that privacy is a highly important human value necessary for many aspects of an individual's moral and social being, such as: privacy being a requirement of the ability to develop diverse and meaningful relationships [30, 55]; privacy being a basic aspect of individual personality and integrity [30]; privacy being a requirement for human dignity and retaining one's uniqueness and autonomy [10]; and privacy being a necessary prerequisite for intimacy [32].

Another area of debate—probably the most fundamental and essential of areas—is the definition of privacy. The fact is that nobody has yet produced a single agreed-upon definition for the right to privacy—and this is not at all surprising, since an individual's conception of privacy is based partly upon their society's general conception of privacy (see Westin's work in [80], taken from his book [79]) and partly upon their own life's experiences and general social attitudes. Despite this difficulty, many people have (of course) attempted to produce all-encompassing definitions of "privacy." Schoeman noted [66] that they all fall into three main categories:

1. The right an individual has to control access to personal information about themselves.

2. The measure of control an individual has over information about themselves, or over who has sensory access to them.

3. The state of limited access to an individual and their personal information.

Each of the three proposed categories of definitions has properties that are pleasing in the attempt to produce a single unified definition of privacy; however, they all also contain some major problems. The first category is simply a statement about privacy that assumes that privacy is a morally significant human value, and therefore something sacred that should be protected, but it does not say why this should be the case, and indeed does not define what privacy actually is. The second category's critics argue that counter-examples can easily be created that question the definition (although its proponents claim that such examples are not realistic and that to produce them "would be to engage in irony" [30]). The third category inherently poses the question as to whether privacy is desirable, and to what extent. It also raises further questions about the difference between privacy itself and the right to privacy—examples where one can be said to have lost privacy but not had one's right to privacy violated (and the converse) are easily constructed.


More recently, theorists have argued that the reason that no-one has yet produced a single unified concept of privacy is that privacy is far too complex to actually capture in a single (relatively) simple definition—therefore it should instead be treated as a collection of related concepts. Benn and Gaus, for example, suggest that privacy is a wide-ranging social concept that shapes how an individual sees and interacts with society [9]. Many studies have been carried out investigating the cultural relativity of privacy: most social theorists claim that privacy is recognised and institutionalised (in some way) in all societies (e.g. [48, 80]), with only a few notable exceptions (e.g. [5]).

The attempt to define privacy has been further complicated by the transition from the industrial society to the technological society and on to the information society. In the information society, an individual's concept of self expands into provinces beyond the traditional ones: it is expressed through, and affected by, technology, and the projection of one's identity grows to include one's online identity. This addition of a component that exists in a totally different kind of space shifts and blurs the public/private boundary, making any attempt to define privacy even more difficult than ever before. Thus, increasing numbers of theorists have started to subscribe to the view that privacy is in fact a collection of concepts rather than one specific concept.

On the other hand, there are those that are more critical of the idea of privacy. Some of the most common arguments against privacy are those such as the view put forward by Prosser and Thomson, who hold that privacy is non-distinctive—there is no "right to privacy" [54, 71]. They argue that while privacy is important, thinking of privacy as something special is unproductive, as any interest that could be categorised as a privacy interest could be equally well explained and better protected by considering other interests or rights, such as property rights and the right to bodily security. This view is not actually "anti-privacy" per se, but can be considered critical of it as it questions the whole foundations of privacy as previously discussed. When investigating privacy from a purely legal standpoint, Volokh came to a similar conclusion—that privacy is possibly best protected in law by relying on contractual protections [74].

Delving deeper into the criticism of privacy, a highly sceptical view was produced by Wasserstrom, who argued that withholding information about oneself might be morally equivalent to deception, and therefore socially undesirable [76]. He suggests that views on privacy encourage individuals to feel more vulnerable than they should, simply by accepting the notion that there are thoughts and actions which should make one feel ashamed or embarrassed, and that privacy encourages hypocrisy and deceit.

Another view sceptical of the reasons for privacy—one possibly more interesting in the context of e-commerce—was introduced by Posner. He argued that privacy interests are non-distinctive, and are better thought of in economic terms [52]. He posits that information can have value—people will incur costs to discover it—and that there are therefore two economic goods: "privacy" and "prying." He however regards them as instrumental rather than "final" goods, allowing them to be analysed economically. According to this idea, people do not desire privacy for privacy's sake, but for the economic or social advantage that it gives them. His view is that privacy should only be protected when allowing access to information would reduce its value. He classifies personal information such as "my ill health, evil temper, even my income" [52] as facts that should not generally be protected, since the main motive for


concealment is often to mislead others, or for private economic gain. Since corporate gains enhance the economy more than individual gains, he concludes that defence of individual privacy is hard to justify, as it can negatively impact these more "important" corporate gains. Whether one agrees with this stance of course depends entirely on whether one is a corporation or an individual. Also, thinking specifically about e-commerce, this view of privacy does not take into account the facts presented in the introduction—that favouring corporate over individual privacy leads to a reduction in the amount of e-commerce an individual will engage in.

One thing that all of these arguments—both those for and those against privacy—have in common is that they all agree on the core existence of the concept of privacy: to debate whether something is materially good or bad necessarily presupposes the existence of the subject of the debate. Thus, virtually all theorists have acknowledged that the idea of privacy is indeed a real concept. Additionally, although a single, all-encompassing view of privacy has not yet been presented (and is not likely ever to be presented), individual privacy (however defined) is seen by the majority of theorists as a highly important human value. Thus, this value deserves to be fiercely protected in all areas of life: both the existing areas and the newly developing areas presented by the information age.

2.2 The development of privacy

The idea of privacy, as with any other human sociological creation, is not absolute and static—developments in society itself have grown and shaped this human value for most of recorded human history. The development of privacy has gone through several main stages in the lead-up from its initial articulation in ancient times to its current incarnation in today's world. Each of these stages can be differentiated by its unique view of privacy and how it has been protected.

2.2.1 The infancy of privacy

The concept of privacy has roots dating back many millennia. In approximately 350 BC, in his treatise Politics, Aristotle distinguished between the public sphere of the city and its political activities (polis) and the private sphere of the household and its domestic life (oikos). Aristotle claimed that the private sphere has an inherent hierarchical structure whose assets provide the material ability for the citizen to act in the public political sphere. Thus, the presence of the "private" is a necessity for the smooth running of the "public."

This idea of public and private spheres was embodied in Greco–Roman society, where the public sphere was not a metaphorical place but an actual physical space—the Roman forum and the Greek agora—where public and legal affairs were discussed and where any free man could directly participate in the running of public life.

Greek society originally conceived of no separation between the two spheres. The concept of an individual being separate from the polis was brought to the Greek peninsula in the fifth century BC by radical Sophists, teaching that the human being was the measure of all things—not the city or gods, as was the prevailing philosophy [61]. This view was alien and highly radical but slowly permeated through Greek


society, being discussed by the great Greek philosophers and ultimately inspiring Aristotle's Politics as an attempt to reconcile the opposing views.

The Romans further developed the idea of the private sphere as opposed to the public sphere. In Roman society, the notion of public equated to the good of the state and its sovereignty, while the notion of private equated to the interests of the individuals in the empire [36, 77]. These notions were eventually accepted as fundamental enough that they were incorporated into late Roman Law—in the first chapter of the two sections of the Corpus Juris Civilis, the compilation of Roman Law issued by Emperor Justinian in 533–534 AD [77]. Indeed, the words "public" and "private" have a Roman origin.²

As the Roman Empire fell into decline and the middle ages advanced upon Europe, the idea of the separation of the state and the individual, the public and the private, fell out of public consciousness for reasons that are quite understandable. Society at the time simply did not have any significant distinction between public and private, mainly due to the feudal system of rule, which was based upon kinship and bonds of loyalty—basically a network of personal dependent links in which there was no distinction between state and individual [77].

The revival of the separation of public and private began in more modern history, first with the Enlightenment and later with the growth of capitalism, as the separation of the sovereign and the citizen once more occurred and public political society came back into prevalence. Habermas famously discussed how the development of the public sphere was affected by the development of bourgeois society [36].

In this age of privacy, the basic tenet of privacy was slowly recognised as a concrete human value, and thus support for treating it as such began to grow. With these occurrences the concept of privacy entered a new age, as it began to be afforded legal protections in an effort by legislators to protect this most basic of human rights.

2.2.2 The legal age of privacy

One of the first legal arenas in which privacy was deliberated was English common law, where the idea of the right to privacy has been discussed for at least the last few centuries in numerous legal cases and judgements. One example was articulated by Mr. Justice Yates in 1769 (Millar v. Taylor [83]):

It is certain every man has a right to keep his own sentiments, if he pleases. He has certainly a right to judge whether he will make them public, or commit them only to the sight of his friends.

These early English legal cases paved the way for many further privacy developments. Since most American legal practices have their roots in English common law, it has been argued by many that one of the most basic and well-known of these developments is the U.S. constitution—many arguments have been put forth supporting the idea that the distinction of the private sphere is enshrined within it. The First

²Public originally comes from the Latin "populus" (the people), while private originally comes from the Latin "privus" (single, alone).


Amendment protects freedom of conscience and the Fourth Amendment protects the physical private sphere of an individual against unreasonable violation. Other amendments (the Third, Fifth and Ninth) also arguably protect privacy interests.

This basic idea of privacy being incorporated into the very foundation of the United States further paved the way for one of the most famous, influential and oft-cited articles about privacy, published in 1890 by Warren and Brandeis [75]. Their article mainly focused upon the privacy violation that can occur due to the public dissemination of information about an individual that the individual would rather stay private (the article was inspired by the press intruding in Mr. Warren's private life—the reporting on the wedding of his daughter). During the course of the article the authors discuss many aspects of privacy—including control over one's private thoughts—and they connect it to many other values, such as an individual's "right to be left alone" and the respect due an individual's "inviolate personality," although they (perhaps sensibly) do not attempt to define what privacy actually is. Their main argument is that in order for law to properly protect privacy, privacy needs explicit legal recognition, as simply applying other legal arguments to protect privacy—such as using copyright law or contract law—is inadequate. A significant claim that they make is that privacy is a specific human interest, connected to the moral character, and that this interest is more important in the present than it has been in the past—the importance of privacy is increasing.

In the wake of Warren and Brandeis' article, privacy related cases supporting the article's views slowly began to surface over the next several decades. This continued until 1965, when what is known as the "constitutional right to privacy" was explicitly recognised by the U.S. Supreme Court (Griswold v. Connecticut, 381 U.S. 479 (1965)). Until this point, protection of privacy in U.S. law was simply viewed as being necessary to the protection of other more well-established rights, and was therefore dealt with as such. Justice Douglas wrote that the case in question concerned "a relationship lying within the zone of privacy created by several fundamental constitutional guarantees"—the amendments previously mentioned, each of which creates different zones or "penumbras" of privacy. Some people now argue that this ruling should be taken further and that another amendment, explicitly recognising the right to privacy in a specific and fundamental manner, should be added to the constitution.

Outside of the USA, many other countries developed laws that protected people's privacy. For example, in 1789 in France, La Declaration des droits de l'homme et du citoyen (The Declaration of the Rights of Man and of the Citizen) was adopted by the ruling government, which included privacy guarantees to its post-revolutionary citizens. Meanwhile, on a wider scale, two highly important developments occurred.

Firstly, in 1948 (on December 10th), the General Assembly of the United Nations adopted the Universal Declaration of Human Rights, wherein privacy was enumerated in Article 12:

No one shall be subjected to arbitrary interference with his privacy, family, home or correspondence, nor to attacks upon his honour and reputation. Everyone has the right to the protection of the law against such interference or attacks.

However, since the Universal Declaration of Human Rights was so wide-ranging in scope, it never garnered the international consensus necessary to become a binding


treaty. To solve this problem, the declaration was split into two binding covenants—with the privacy guarantees becoming part of the International Covenant on Civil and Political Rights (Article 17), which has been ratified by 149 parties worldwide.

Secondly, in 1950, the European Convention on Human Rights (officially the Convention for the Protection of Human Rights and Fundamental Freedoms) was adopted by most Council of Europe member states, wherein privacy was enumerated in Article 8 (right to respect for private life):

1. Everyone has the right to respect for his private and family life, his home and his correspondence.

2. There shall be no interference by a public authority with the exercise of this right except such as is in accordance with the law and is necessary in a democratic society in the interests of national security, public safety or the economic well-being of the country, for the prevention of disorder or crime, for the protection of health or morals, or for the protection of the rights and freedoms of others.

The history of privacy thus far has taught us that, as societies have risen and fallen in their haphazard progression and current society has slowly emerged, the importance placed upon the ideal of privacy as a fundamental human right has increased enormously, and the necessity for its protection has thus also grown in importance. However, the nature of the protection of privacy has changed as this legal age of privacy gradually gave way to the current modern age of privacy. This transition was initiated by the ever-forward march of technology, which has affected both the concept of privacy and the mechanisms of its protection in some major ways.

2.2.3 Privacy in the technological age

The privacy implications of ever-evolving technology are not exactly new: Warren and Brandeis’ “right to be left alone” came from a time when privacy was threatened by a new technology that allowed photographs to be included in mass-circulation newspapers. In the 115 years since their article, the problems posed by technology have multiplied wildly, and privacy protection has struggled to keep pace. Tuerkheimer classified these problems into two broad categories: surveillance and personal data protection [72].

Surveillance technologies include such things as electronic wiretapping and monitoring of computer network traffic. Specific examples of privacy issues brought forth by these technologies (both already available and forthcoming) include wireless telematics networks embedded in lamp posts tracking the location of one’s automobile at all times [42], widely deployed facial recognition systems tracking the location of individuals, national identity cards [53], mandatory drug testing of employees, caller ID, biometrics, blanket CCTV coverage [11], monitoring of computer traffic, email and telephone calls, and governmental surveillance systems such as Carnivore and Echelon [26]. All of these technologies can violate an individual’s privacy in major ways; in extreme cases a person can be monitored 24 hours a day without any awareness of the fact.

So far, there has been little legislation guarding against privacy violations caused by such surveillance technologies, except in a few specific cases: for example, the basic protections that the Fourth Amendment gives Americans against unreasonable violation were extended in the latter half of the last century to cover developments in electronic surveillance (Katz v. United States, 389 U.S. 347 (1967)), requiring U.S. government agencies to obtain a court order giving permission to use a wiretap. This does not, however, solve the problem in many other countries where no such order is necessary, and it also fails to address monitoring by private organisations.

Personal data protection is the well-known privacy issue created by the proliferation of databases containing information about individuals’ lives, habits, preferences, and personal histories. From virtually every place one goes to shop, from the supermarket to the video store to the bookstore to the pharmacy, a steady stream of information and statistical data about customers pours into vast warehouses of data. These vast treasure-troves of knowledge can help companies in their drive for ultimate efficiency and therefore, the argument goes, the good of the customers. Undoubtedly this is true. However, it can also give a business a huge unfair advantage in its dealings with customers, as it leaves the business holding all the cards in the game of commerce, while most of the individuals playing do not know how to play the game, or even that they are playing at all. The time may come when everything it is possible to know about an individual is stored somewhere, with no technological guarantee that this information cannot be accessed by anyone, at any time.

In an attempt to counter this problem, governments in many countries have enacted legislation that specifically attempts to protect the privacy of this data: for example, in the UK and Sweden there is a legal restriction on any entity possessing any kind of personal information without the explicit consent of the data subject, and every entity that does store such data has to register this fact with the government. This approach to privacy protection led to the Organisation for Economic Co-operation and Development issuing a set of guidelines (the OECD privacy guidelines) which set out the minimum standards for data collection, storage, processing and dissemination that both the public and the private sector should adhere to. These guidelines are commonly consulted by nations and private organisations when drafting privacy laws and policies. One notable set of privacy laws based upon these principles is that created in Europe: in 1981 personal data protection gained specific protection when the Council of Europe’s Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data came into being. This convention obliged its signatories to enact legislation concerning the automatic processing of personal data, resulting in different types of regulatory models and approaches [45] being adopted, ranging from the “self-help” approach (where there is no government interference and it is up to a country’s citizens to challenge inappropriate practices and bring them to the attention of the courts) to the “registration”/“licensing” approach (where a government takes full control of ensuring that personal data about its citizens is not misused).

After viewing the range of diverging legislation that had been enacted, the European Commission realised that the different, and sometimes contradictory, approaches and laws enacted by its member states would impede the free flow of data within the EU. Therefore the European Commission decided to harmonise data protection regulation across the member states and proposed the Directive on the Protection of Personal Data (officially Directive 95/46/EC on the protection of individuals with regard to the processing of personal data and on the free movement of such data). All members of the EU had to transpose this legislation into their internal law by 1998, and all did. The directive made many things legally mandatory, including the creation of government data protection agencies and the registration of any databases of personal information with these agencies. One of the stipulations of the directive was that personal data could only be transferred to non-EU countries providing an adequate level of protection for the data. However, while the EU and the United States both ostensibly share the goal of enhancing privacy protection for their citizens, they take fundamentally different approaches to achieving it. As we have seen, the EU approach relies on comprehensive legislation to mandate that its corporate citizens be respectful of an individual’s privacy; the USA, by contrast, mixes small amounts of legislation with large amounts of self-regulation, relying on its corporate citizens to behave responsibly. This makes it impossible to show that the USA provides an adequate level of protection, and means that the directive could have significantly hampered many U.S. companies from engaging in transatlantic transactions with European customers. In an attempt to solve this problem, the European Commission and the U.S. Department of Commerce jointly developed what is known as the “Safe Harbor” framework,3 which was adopted in 2000. Under the Safe Harbor agreement, U.S. companies can choose to register and enter the safe harbor (self-certifying annually), agreeing to comply with the agreement’s rules (which include elements such as notice, choice, access, and enforcement). EU organisations can then ensure they are sending personal information to U.S. companies in accordance with EU rules by checking that the U.S. company they are dealing with is on the list of participating companies.

In this age of privacy, technology has advanced so far and so fast that the approach of protecting privacy through legal means is not as effective as it once was: technological development is far outpacing the ability of the legal system to react and adapt to new developments, and in many cases overstepping the line beyond which the legal system is able to protect privacy at all. In an effort to keep up, instead of continuing past legal developments calling for an all-encompassing legal and moral protection of privacy, legislators have instead had to guard against specific privacy violations as they appear, resulting in today’s legal landscape containing a mishmash of legal protections and requirements that defend against isolated pockets of privacy violations.

2.2.4 Privacy in the information age

In the information age that we live in, the nature of privacy changes in some interesting ways. Moor has discussed how in the information age information is “greased”—quick moving and with uses impossible to imagine when it was initially entered onto a computer [46]. The privacy implications of this one development alone are staggering. It is now possible for one to give personal information to one entity (purposefully or without realising it) for one specific purpose or reason, only for it to be transferred to another entity and used for an entirely different (possibly objectionable) purpose, ad infinitum. Moor therefore argues that we need to create “zones of privacy” which allow individuals to control levels of access to private information differently in different situations. He argues that it is important to think of privacy as an amalgamation of the competing ideas that privacy is either about controlling one’s information or about the state of limited access to one’s information, saying that although the control theory is highly desirable, it is impossible to totally control one’s information in a computerised society.

3 http://export.gov/safeharbor.

Obviously, Moor’s conclusion is based upon the assumption that individuals really do care about the privacy of their personal information in this greased world of the information age. However, not everyone believes this to be the case. One point often raised is that while most individuals state they are concerned about privacy, many of them then go online and give away personal information with apparently no thought or hesitation. This, it is argued, seems to suggest that retaining the privacy of their personal information online is not as important to people as they like to think, and therefore it can safely be ignored. To counter this argument, Syverson briefly discussed this view and produced some preliminary evidence debunking it in [70]. To investigate the statistics of this behaviour more formally, Ackerman, Cranor and Reagle surveyed a number of U.S. Internet users, recording their privacy preferences and then giving them a range of e-commerce scenarios and examining their privacy concerns [1]. The results show that while consumers in general have a high level of concern about privacy, when faced with real online situations things complicate somewhat. They state that consumers can be split into three main categories (supporting Westin’s earlier findings in [81]): the privacy fundamentalists, the pragmatic majority, and the marginally concerned. Privacy fundamentalists (17% of participants in the survey) are extremely concerned about privacy and unwilling to provide private information under almost any circumstances. Pragmatists (56%) have privacy concerns, but were willing to give certain private information for pertinent reasons when assured by privacy protection measures. The marginally concerned (27%) were willing to give private information under almost any circumstances. Thus, while some individuals may reveal private information at any time, the majority of people (73%) are at least concerned about their privacy online.

Due to the “greased” nature of information in the age we live in, the methods that best protect privacy are changing. Previously, most privacy violations against individuals were due to calculated acts by specific entities, without the individual’s consent, and therefore the best defence against them was legislation protecting individuals. Privacy violations in the information age, however, can equally likely be due to entities misusing data collected from individuals for apparently legitimate reasons. Thus, protection of privacy becomes a new challenge, as individuals have freely given this information away for a specific purpose—and once this information is out in the wild, it is impossible to recapture and difficult to monitor whether it has been utilised legally or exploited illegally.

These conclusions seem to suggest that a new model of privacy protection is necessary in the information age if the laudable goal of protecting an individual’s privacy is to be achieved. Instead of relying upon legal protections alone to guarantee that no entity can violate one’s privacy, technologies should perhaps instead be used that put individuals back in control of their information: thus allowing them to protect their own privacy to whatever extent they wish, releasing certain amounts of personal information for whatever reasons they wish, tailored to the specific circumstance they are faced with.

3 Privacy in e-commerce

The dawning of the information age has enabled us to realise that, on the one hand, information itself is valuable: possession of intellectual, personal, social and economic information can both create opportunities for the individual and give social and economic advantages to them; but on the other hand it has led to a world where an individual can unwittingly give away this valuable information while engaging in, or attempting to take advantage of, the use of new technologies. This situation is best illustrated in one specific area of recent technological development: e-commerce—where consumers and businesses exchange performative statements via electronic media to express their commitments to the achievement of future states involving the exchange of goods with one another [44].

Transacting in e-commerce today typically necessitates the divulgence of large amounts of personal information which is either necessary for the transaction (for example, credit card information, delivery details) or is desired by the e-business: possession of this information gives it the opportunity to analyse the data, discovering trends and increasing the efficiency of its business dealings. Consumers typically have little idea of the range of possible uses that possession of this information allows, and thus little idea of the violations of their privacy that could occur right under their noses with their unwitting consent. One of these uses is price discrimination—Odlyzko noted that in e-commerce “privacy appears to be declining largely in order to facilitate differential pricing” [50], and discussed the fact that while such price discrimination is largely seen as economically efficient, it is also widely disliked by the public.

Conventionally, the friction between business interests and individual interests has led to the idea that the two viewpoints are mutually exclusive and thus that either one or the other should be encouraged. Generally, economists have tended to uphold the corporate viewpoint, since business interests enhance the economy the most, while privacy advocates and civil liberties groups have upheld the individual viewpoint, arguing that everyone has fundamental human rights to privacy and non-discrimination. In the new world of the information age and e-commerce, however, there seems to be a need, and the possibility, to reach a compromise between the two competing ideas and to arrive at a solution somewhere in the middle ground that is mutually beneficial to all. This compromise is to support consumer-centric privacy in e-commerce: enabling the individual to retain the maximum amount of privacy and control possible over their personal information.

From an individual’s point of view, consumer-centric privacy would mean putting them back in control of their information, to use and give away as they see fit. Aside from the socio-philosophic benefits discussed in the previous section, the maximisation of privacy and control could also produce significant economic benefits:


• It enables individuals to take advantage of the highly valuable assets they possess—their personal information—which is currently given away freely. Keeping control of these assets would give an individual more leverage in economic transactions. Simply stated, this increased leverage should enable individuals to directly “get a better deal” for themselves.

• More generally, if consumer-centric privacy existed in the world of e-commerce, then the security and privacy trust barrier discussed in the introduction would be broken down, and many more individuals would be comfortable being a part of the world of e-commerce, leading to increased e-commerce levels. This increase would benefit the consumer, as increased e-commerce levels should generally lead to more e-commerce providers, and thus to lower prices through increased competition, and all that entails.

From a business’s point of view, consumer-centric privacy would mean considering customers as entities engaging in a two-way business interaction rather than entities to exploit in a one-way process. While at first glance enabling consumer-centric privacy would seem to hold only disadvantages for a business—losing the control e-businesses currently wield over their customers’ information would leave them in less of a position to exploit this information through data mining, price discrimination, etc.—a more careful analysis exposes the possibility of some significant benefits:

• The main possibility is fairly simple: cold hard cash. The privacy concerns of individuals are currently holding back e-commerce levels by significant amounts. Once these concerns are alleviated, the growth of e-commerce should start to reach the uninhibited levels it is capable of—an increase that would almost certainly be significant.

• A by-product of shifting the onus of protecting privacy in e-commerce from the e-business to the individual, and thus to the technology achieving this protection, is that the costs of setting up the systems that provide server-side personalisation and all related technologies should decrease (or even disappear), causing the running and maintenance costs of the e-business to drop.

Thus, when considered as a whole, the possible benefits of encouraging consumer-centric privacy in e-commerce are fairly captivating. There are, however, significant challenges in doing this, both technological and in the fact that it would require providers to embrace a complete paradigm shift in the way they conduct business in the world of e-commerce. To understand some of the technological challenges, we will survey the current technologies that have been produced and draw some overall conclusions about them.

4 Privacy enhancing technologies

Desire for consumer privacy has led some in the research community to design technologies that aim to uphold the ideal of protecting it in the e-commerce environment. These technologies can generally be placed into one of two categories: anonymous technologies and non-anonymous technologies—or, to use the correct antonym of anonymous, “onymous” technologies. The methods by which these two types of technologies attempt to preserve individual privacy differ both in fundamental philosophy and in application.

4.1 Anonymous/pseudonymous techniques

Anonymising/pseudonymising technologies attempt to achieve unlinkability between an individual and any of their personal information. That is, they aim to secure the privacy of an individual’s personal information by trying to ensure that any personal information released to an organisation cannot be linked to the individual’s real identity. A range of levels of anonymity is available: from the truly anonymous—no one can find out who you really are; through the pseudo-anonymous—your identity is generally not known but can be obtained if deemed necessary and with enough hard work; to the pseudonymous—where an individual can create a range of virtual identities for use in various circumstances. Throughout this spectrum individual privacy is maintained: although an attacker trying to harvest personal information may gather large quantities of it, the material gathered cannot (normally) be linked back to a specific individual.

User anonymity (at whatever level) can be achieved through one of three main methods: anonymising the transport medium; allowing anonymous but accountable transactions (credential systems); and “scrubbing” data stored by an organisation.

4.1.1 Anonymising the transport medium

One method of enforcing anonymity between consumers and their prospective e-commerce providers is to ensure that any communications between the two occur in such a way that the original identity of the consumer cannot be gleaned through examination of any of the communications, or by eavesdropping on communication patterns. For example, when a consumer browses through an e-business’ website and buys items, this type of technology aims to prevent the e-business from ever knowing exactly who it is dealing with. Technologies have been created that attempt to achieve this goal, with varying degrees of success.

One of the simplest ways for an individual to achieve anonymity in this fashion is simply to set up an account with a free email service such as Hotmail,4 Yahoo Mail5 or Mail.com.6 This allows an individual to have a point of contact for communications, enabling them to be members of online communities, have accounts with e-businesses, etc. Hushmail7 takes this idea further, offering end-to-end encryption for its users and ensuring eavesdroppers cannot breach the privacy of this setup. This simple approach to anonymity through free email systems, however, requires that the individual trust the email service provider completely—that they will not log communication details such as IP addresses during email sessions. Additionally, this approach is fast becoming impossible, as the majority of these services increasingly require personal details to sign up. That being said, there is generally nothing to stop users signing up with false details.

4 http://www.hotmail.com/.
5 http://mail.yahoo.com/.
6 http://www.mail.com/.
7 http://www.hushmail.com/.

A step up in technological complexity leads us to a well-known tool for anonymous web browsing—Anonymizer.8 When an individual uses this service to view items on the Internet or submit information to remote sites, all communications are routed through Anonymizer’s servers—thus the remote site has no way of detecting the IP address or identity of the user. However, this kind of technique also requires a trusted third party—in this case Anonymizer itself—simply because Anonymizer’s servers (or the user’s ISP) can certainly identify the user if they so desire.

To achieve complete anonymity on the Internet, tools are needed that do not rely on a trusted third party. In this vein, Reiter and Rubin created a system called Crowds [58, 59] that operates by grouping users into large groups (crowds). Individuals connect to a crowd and, instead of issuing requests directly to Internet servers, give them to the crowd. A request gets passed around the members of the crowd with an arbitrary degree of randomness until it eventually gets submitted to the intended recipient; to use Crowds is essentially to play “pass the parcel” with users’ requests. The recipient of the request thus cannot identify who in the crowd issued the initial request, as it is equally likely to have been any of the members. However, malicious behaviour by rogue crowd members can affect the usefulness and reliability of the system—although they cannot compromise the anonymity of any of the other members through such behaviour.
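
The probabilistic forwarding rule can be sketched in a few lines; the 0.67 forwarding probability and the member names below are illustrative choices, not values fixed by the Crowds papers:

```python
import random

def crowds_route(members, forward_prob=0.67):
    """Simulate Crowds path formation: the initiator hands the request to a
    random crowd member; each member then flips a biased coin and either
    forwards to another random member or submits to the web server."""
    path = [random.choice(members)]           # first hop inside the crowd
    while random.random() < forward_prob:     # forward with probability p_f
        path.append(random.choice(members))   # any member (even itself) may be next
    return path                               # the last member submits the request

crowd = [f"jondo-{i}" for i in range(10)]     # "jondo" is the Crowds term for a member
path = crowds_route(crowd)
# The server only ever sees the last member on the path; from its point of
# view, every crowd member is an equally plausible initiator.
```

Because path length is governed by a coin flip rather than a fixed count, even colluding relays can only assign probabilities, not certainty, to who initiated a request.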

Another step up in complexity leads to technologies that use encryption to help solve the problem. A well-known technology of this type that spawned numerous offshoots was created by Chaum in 1981 [16]. A Chaum mix is a system based upon public key cryptography that allows users to communicate via email while remaining anonymous to each other (and to any global eavesdropper), all without needing any guarantees as to the security of the underlying communications system. It does this by ensuring that messages passing through the system are of equal size, cryptographically changing them, and then sending the messages to their recipients in a different order. This makes it very difficult for even a global eavesdropper to link an incoming message and its sender to an outgoing message and its recipient. Chaum mixes can be improved by linking mixes together to create a “cascade” of mixes, further strengthening the security guarantees and removing the need for a user to trust a single mix server.
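
The padding and reordering steps described above can be sketched as follows. The 512-byte pad size is an arbitrary illustrative choice, and a real mix also applies a public-key decryption to each message, which this sketch omits:

```python
import random

PAD = 512  # every message leaving the mix is padded to the same length

def mix_batch(inbox):
    """inbox: list of (recipient, message) pairs collected by the mix.
    Uniform padding defeats linking by size; shuffling the batch defeats
    linking by arrival/departure order."""
    outbox = [(rcpt, msg + b"\x00" * (PAD - len(msg))) for rcpt, msg in inbox]
    random.shuffle(outbox)  # flush in an order unrelated to arrival order
    return outbox

batch = mix_batch([("alice", b"hi"), ("bob", b"meet at noon"), ("carol", b"ok")])
# An eavesdropper watching both sides of the mix now sees three equal-sized
# messages leave in a random order, with no size or timing correlation.
```

Cascading simply feeds one mix's outbox into the next mix's inbox, so no single mix operator can link senders to recipients on their own.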

One of the offshoots of Chaum mixes was created by Goldschlag, Reed and Syverson [33, 34]. They designed an anonymous communication architecture called Onion Routing, which can be used by any protocol capable of being adapted to use a proxy service. It is built upon the idea of a network of dynamic, real-time Chaum mixes. Onion Routing allows bi-directional, near real-time connections that are highly resistant to eavesdropping and traffic analysis. The user submits an encrypted request in the form of an onion—a layered data structure specifying the properties of the connection at each point along the route (including cryptographic information and keys). Each point can only decrypt its own layer, finding out only where the next point in the route is, except for the final point, which decrypts its onion to find the request to send and whom to send it to, which it then does. Thus, a recipient only knows the identity of the person at the end of the chain. The Freedom Network9 designed and operated by Zero-Knowledge Systems Inc. is a commercial implementation of a variant of Onion Routing, routing IP connections through intermediate nodes to provide users with anonymity.

8 http://www.anonymizer.com/.
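
The layered structure can be illustrated with a toy sketch. The SHA-256 keystream and fixed 16-byte hop fields below are stand-ins for real public-key cryptography (they are not secure), but the peeling of one layer per relay mirrors the scheme described above:

```python
import hashlib

def keystream(key: bytes, n: int) -> bytes:
    """Illustrative keystream from SHA-256 in counter mode (NOT real crypto)."""
    out, ctr = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + ctr.to_bytes(4, "big")).digest()
        ctr += 1
    return out[:n]

def xor(data: bytes, ks: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(data, ks))

def wrap(payload: bytes, route):
    """Build the onion innermost layer first; each layer names the next hop."""
    onion, next_hop = payload, b"EXIT".ljust(16)
    for name, key in reversed(route):
        plain = next_hop + onion
        onion = xor(plain, keystream(key, len(plain)))
        next_hop = name.encode().ljust(16)
    return onion

def unwrap(onion: bytes, key: bytes):
    """Strip exactly one layer, recovering (next hop, inner onion)."""
    plain = xor(onion, keystream(key, len(onion)))
    return plain[:16].rstrip(), plain[16:]

route = [("relay-A", b"key-A"), ("relay-B", b"key-B"), ("relay-C", b"key-C")]
onion = wrap(b"GET /", route)
for name, key in route:        # each relay peels one layer in turn
    hop, onion = unwrap(onion, key)
# the final relay ends with hop == b"EXIT" and onion == b"GET /"
```

Note how relay-B learns only relay-A (its predecessor) and relay-C (its successor): neither the initiator nor the final request is visible to any single intermediate node.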

This first generation of Onion Routing, however, never developed much beyond a proof-of-concept system that ran on a single machine. A second-generation Onion Routing system (supported by the Electronic Frontier Foundation) called Tor has recently been presented [24]. This new generation of communication service addresses many of the limitations of the original design, adding many useful features designed to make Onion Routing secure, efficient, and usable enough for real-world use.

All of these anonymising technologies, however, have one major caveat: they only provide anonymity if certain conditions are met. Some of these conditions have been noted by Clayton, Danezis and Kuhn [19]: for example, attacks can reveal a supposedly anonymous individual’s IP address through client-side scripting, images which can be tracked when loaded by the user (web bugs), cookie stealing, and many other methods. The authors’ conclusion in [19] is that it is important not only to think about the anonymity properties of communication channels, but also to consider ways of protecting anonymity throughout the entire system. To use the old adage, “a chain is only as secure as its weakest link.”

4.1.2 Credential systems

Besides anonymising the transport medium, another method of enabling anonymity between a customer and an e-business is the use of a credential system (sometimes referred to as a pseudonym system). In a credential system, users are known to the organisation they are doing business with only by a pseudonym (or nym). A single user can use different pseudonyms with different organisations, and these cannot be linked together by any member of the system. However, an organisation can issue a credential to a pseudonym, and the user can then prove possession of this credential to another organisation, revealing only that they own one. A certification authority sometimes plays an important role in guaranteeing that the system is used properly and that users can be trusted. In e-commerce transactions, these technologies enable customers to buy items from an e-business by proving certain facts—that they are eligible to buy the item, that they have made a payment, etc.—all without the e-business knowing exactly who they are dealing with.
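
A core building block behind such unlinkable credentials is the blind signature. As a loose, toy-sized illustration of the underlying idea (textbook RSA blinding with tiny parameters, not the full credential schemes discussed below), an authority can sign a value without ever seeing it:

```python
import math
import random

# Toy RSA parameters (textbook sizes, far too small for real use)
p, q = 61, 53
n = p * q                            # modulus 3233
e = 17                               # public exponent
d = pow(e, -1, (p - 1) * (q - 1))    # private exponent

def blind_sign(m: int) -> int:
    """The user blinds m, the authority signs blindly, the user unblinds."""
    while True:                      # pick a blinding factor coprime to n
        r = random.randrange(2, n)
        if math.gcd(r, n) == 1:
            break
    blinded = (pow(r, e, n) * m) % n        # authority sees only this value
    s_blinded = pow(blinded, d, n)          # authority signs without seeing m
    return (s_blinded * pow(r, -1, n)) % n  # unblinding yields m^d mod n

m = 42
s = blind_sign(m)
assert pow(s, e, n) == m  # the signature verifies, yet the signer never saw m
```

The unblinding works because (r^e · m)^d = r · m^d (mod n), so multiplying by r⁻¹ leaves a valid signature on m that the authority cannot link back to the signing session.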

The idea and basic framework of credential systems were first introduced by Chaum in 1985 [14], who soon after published a full model with security proofs using RSA as a one-way function [15]. This model, however, requires a trusted third party (the certification authority) which manages the transfer of users’ credentials between organisations. To relax this constraint, Damgård developed a model that only needs a third party to be involved—it does not necessarily have to be trusted [22]. It does this by using the idea of multi-party computation [35], resulting in a scheme that is provably immune to forgery by malicious entities in the system. This model, however, is not meant for implementation, as some of the methods used are too inefficient for heavy use. A practical version of this work was produced by Chen [17]; it, however, requires that the certification authority behave honestly.

9 http://www.freedom.net/.

One weakness of all of these models of credential systems is that there is nothing to stop a user from sharing their pseudonym and/or credentials with other users. For example, if a user was issued with a certificate proving that they are over 18, they could share this with under-age users, who could use it to purchase age-limited products. While this is acceptable from the perspective of protecting privacy, it may result in business transactions that are not valid or intended. In an attempt to solve this problem, Lysyanskaya, Rivest and Sahai produced a model [43] that includes the presumption that the user’s private key (linked to the corresponding public key in the credential system) is something that they are motivated to keep secret—for example, it could be their digital signature key. If a user then shared a credential with another user, the other user would have access to this secret; if the secret was the digital signature key, the other user could then forge signatures on any documents in the original user’s name. This idea was termed non-transferability. As with Damgård’s model, however, this model is not directly usable in practice due to its reliance on methods that are too inefficient.

Camenisch and Lysyanskaya further developed this idea of non-transferability [12], producing another model for a credential system that additionally has the optional property of allowing a user's identity to be revealed if the user misuses their credential or uses it in an illegal transaction. This model, however, requires that the certification authority is trusted to do its job properly; this risk can be minimised through distribution of the tasks of the certification authority, weakening the trust assumptions. Camenisch and Van Herreweghen have recently described a prototype of a system based upon this model [13].

More recently, the idea of credential systems, along with the lack of success of PKI (Public Key Infrastructure) systems, has led to the development of PMI (Privilege Management Infrastructure) and AAI (Authentication and Authorisation Infrastructure) systems. Several proposals for working systems have been put forward, such as Kerberos, SESAME, PERMIS, AKENTI, Microsoft Passport, Shibboleth and the Liberty Framework. Schlaeger and Pernul surveyed these new proposals, concluding that none of them "is perfectly suitable for b2c e-commerce" [62].

4.1.3 Privacy in databases

A different approach to enabling anonymity of a user "after the fact" consists essentially of finding a method of allowing an organisation to hold a vast database of information about its customers, yet guaranteeing that any information gleaned from the database cannot be linked back to a specific individual. This means that customers can buy items from an e-business in a completely standard way, but when their data is stored in a database, all information that uniquely identifies the owner of the information is removed, thereby achieving anonymity. This technique was originally pioneered to help solve privacy issues in statistical databases. In these databases it is the statistical information about the data, rather than the data itself, that is important; thus methods have been considered that can keep the statistics of the data-set valid whilst keeping the individual data itself private. Two excellent surveys of these areas were produced by Adam and Worthmann [2] and Shoshani [68]. Broadly, the methods that attempt to accomplish this ideal of privacy of database information can be split into three main categories: query restriction, data perturbation and output perturbation [2].

The goal of technologies that fall into the query restriction category is to retain the privacy of individual data items by restricting the information that can be released. In this approach, only queries that obey specific criteria are allowed, in an effort to prevent information about specific data items becoming known. The problem with this approach is that only a small subset of possible queries is allowed, reducing the usefulness of the database. Similar to this idea is query auditing [18], where an audit trail of all queries that have been performed is kept and every new query is checked for possible compromise. If a possible compromise is detected, the query is disallowed. The main problem with this approach, however, is that an entity can use the results of queries, along with the knowledge of which queries have been allowed and denied, to infer data and thus compromise the privacy of the data contained in the database.
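As a toy illustration of the query restriction idea (the data set, column names and size threshold below are all invented for the example), a query-set-size control refuses any aggregate query whose matching set is too small or too large, since such answers can isolate individual records:

```python
# Hypothetical dataset of (name, salary) records; names are identifying.
RECORDS = [("alice", 30000), ("bob", 42000), ("carol", 55000),
           ("dave", 61000), ("eve", 38000), ("frank", 47000)]

def restricted_mean(predicate, k=2):
    """Answer a mean-salary query only if k <= |query set| <= n - k;
    otherwise refuse the query as potentially compromising."""
    matching = [salary for name, salary in RECORDS if predicate(name, salary)]
    n = len(RECORDS)
    if not (k <= len(matching) <= n - k):
        return None  # query refused
    return sum(matching) / len(matching)

# An aggregate over four records is answered; a query that pinpoints a
# single named individual is refused.
print(restricted_mean(lambda name, salary: salary > 40000))   # 51250.0
print(restricted_mean(lambda name, salary: name == "alice"))  # None
```

As the paragraph notes, such a control on its own is weak: an attacker can still combine several allowed answers to infer a refused one.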

The goal of data perturbation is to modify the original database in such a way that the overall statistics remain valid while individual items are changed, thus preserving the privacy of individual records. A very basic method of achieving a limited degree of confidentiality is simple rounding of numerical data. This naive approach can be improved slightly by rounding randomly or by adding random values with a mean of zero [73]. This idea has been further developed, for example in [3, 4]. Other methods have been proposed that fall into this category, including data swapping, which involves swapping each item in the database with another one from the same distribution, thus creating a new database with supposedly the same statistics [57]. However, by its very nature the process of modifying the original data will always alter the overall statistics at least slightly, as well as making the original data meaningless in and of itself, making the information gained less useful overall.
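The zero-mean additive noise idea in [73] can be sketched as follows (synthetic data; the noise scale and fixed seed are arbitrary choices made only for this example):

```python
import random

def perturb(values, scale=5.0, seed=42):
    """Mask each stored value with zero-mean Gaussian noise: individual
    records no longer reveal their true values, while aggregates such as
    the mean remain approximately valid."""
    rng = random.Random(seed)  # fixed seed only for reproducibility here
    return [v + rng.gauss(0.0, scale) for v in values]

original = [30.0, 42.0, 55.0, 61.0, 38.0, 47.0]
noisy = perturb(original)

true_mean = sum(original) / len(original)
noisy_mean = sum(noisy) / len(noisy)
# Every individual item has changed, but the two means stay close.
```

Note the trade-off described above: the noise that protects individual records also makes every released statistic slightly wrong.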

The final category, output perturbation, allows a database to permanently store the original data and perform queries on it; however, the results of any query performed are altered before being returned to the user, such that the original data cannot be inferred. Methods that achieve this include adding a random perturbation to query results, with increasing variance as queries are repeated [8].
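A sketch of this scheme (an illustrative class written for this survey, not code from [8]) stores the true data unmodified but noises every answer, growing the noise scale each time the same query is repeated so that averaging repeated answers does not recover the exact value:

```python
import random

class PerturbedDB:
    """Output-perturbation sketch: true data is stored intact; each
    answer is perturbed, with noise growing on repeated queries."""

    def __init__(self, values, base_scale=1.0, seed=7):
        self._values = list(values)
        self._rng = random.Random(seed)  # seeded only for reproducibility
        self._base = base_scale
        self._times_asked = {}

    def total(self, key="total"):
        count = self._times_asked.get(key, 0) + 1
        self._times_asked[key] = count
        scale = self._base * count  # noise scale grows with repetition
        return sum(self._values) + self._rng.gauss(0.0, scale)

db = PerturbedDB([10, 20, 30])
first, second = db.total(), db.total()
# Both answers hover around the true total of 60, but neither is exact,
# and repeating the query does not tighten the estimate.
```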

All of these techniques have one major technical drawback, however: they do not adequately resolve the conflict between providing high-quality statistics and simultaneously preventing disclosure of individual information [2]. Moreover, achieving the privacy of the individual data itself is a very hard problem. Denning and Denning discussed the details of this problem, whereby statistical information "contain[s] vestiges of the original information; a snooper might be able to reconstruct this information by processing enough summaries" [23], drawing on work both by themselves and by Schlorer (e.g. [63–65]).

From an e-commerce and consumer point of view there is an even bigger drawback with these methods: any individual wishing to enter into a business relationship with an organisation has to trust that organisation's word that once their personal information is received and stored it will be anonymised.


4.1.4 Summary

Where anonymous/pseudonymous techniques are both desirable and technically plausible, they are an excellent method of preserving individual privacy. A range of techniques has been proposed, ranging from the simple to the technologically complex. A summary of the techniques surveyed is presented in Table 1, where we compare each of the techniques by their basic architecture, whether they are usable in practical circumstances, and in which application areas they can be used successfully. As can be seen, the majority of the anonymous/pseudonymous privacy enhancing techniques are practically usable in areas involving web applications (which includes e-commerce), but require a trusted third party in their architecture. This is potentially a serious limiting factor for the usefulness of this type of technology: it requires that entities exist that consumers trust entirely (with all the problems this entails); also, as e-commerce develops further and e-businesses become ever more distributed, it may not be realistic to require the existence of a trusted third party in order to protect consumers' privacy.

Of course, other privacy-enhancing techniques are available. One such area is steganography [40, 41]: the science of hiding

Table 1 Anonymous/pseudonymous privacy enhancing technologies

Technology References Architecture Usable Application areas

T3P 3P De E-mail Web Other

Anonymous techniques

Anonymous e-mail email.com, etc. x x x

“Anonymizer” anonymizer.com x x x

“Crowds” [58, 59] x x x

Simple Chaum mix [16] x x x

Network of Chaum mixes [16] x x x

“Onion Routing” [33, 34] x x x x

Credential systems (CS)

Chaum’s CS [14, 15] x x x

Damgård’s CS [22] x x

Chen’s CS [17] x x x

Lysyanskaya et al.’s CS [43] x x

Camenisch et al.’s CS [12] x x x

“Idemix” [13] x x x

Database privacy

Query restriction [18] x x N/A N/A N/A

Data perturbation [3, 4, 57, 73] x x N/A N/A N/A

Output perturbation [8] x x N/A N/A N/A

Key: T3P = Trusted Third Party; 3P = Third Party; De = Decentralised


communication of a message between two parties in such a manner that an eavesdropper would not know that the message exists. However, while these techniques are highly developed, none are realistically applicable to the field of e-commerce as it stands today (e.g. in the case of steganography, hiding the fact that an individual e-commerce transaction was occurring from an eavesdropper would only guard against privacy invasions by that eavesdropper, which is not where most privacy invasions are occurring).

4.2 Onymous techniques

Preserving privacy by staying anonymous is not always possible. Sometimes an individual is required to be identified by an e-business in order to receive its services. Examples of cases where this may be true are the use of digital libraries where payment is required to access the data (and therefore credit card details, and thus identity, can be revealed), or standard Business-to-Consumer (B2C) e-commerce: if an individual orders a book from Amazon, payment details as well as delivery details are required. While anonymous e-cash systems have been developed that could solve the first problem, any material goods one orders still need to be delivered; therefore an address needs to be supplied, and thence identity can be discovered. Solutions to this problem have also been proposed, a simple example being that an individual set up a PO Box. However, none of the solutions developed so far would realistically work in the real world of e-commerce, where the average individual would simply not bother engaging in e-commerce if it meant going well out of one's way to retain privacy by enacting one of these solutions.

Thus, onymous technologies have started to be developed to counter this problem. The main philosophy behind onymous technologies is not to attempt to withhold an individual's identity from an organisation, but instead to help individuals preserve the privacy of some of their information, or at the very least to help them make informed decisions about which entities can be trusted. This area of technology is becoming increasingly important as more resources on the Internet slowly move away from the free-service model and as security concerns press for fully accountable and identifiable transactions.

Onymous technologies fall largely into two distinct groups: those that help consumers to make informed decisions when transacting in e-commerce, and those that actively attempt to enforce the preservation of privacy.

4.2.1 Decision helping techniques

One of the simplest methods of helping to maintain consumer privacy onymously is to guide a consumer in deciding which e-businesses can be trusted to be (relatively) respectful of their privacy and which to avoid. One such method is embodied by TRUSTe, the "online privacy seal." TRUSTe is simply a programme designed to help an individual choose which websites they can trust and enter into business with. The TRUSTe organisation issues a "trustmark" to e-businesses that adhere to TRUSTe's privacy principles (practices approved by the U.S. Department of Commerce and the Federal Trade Commission) and which allow oversight to ensure that they follow through on their promises. Moores and Dhillon [47] conducted a study into the effectiveness of these methods and concluded that while they can be effective for the organisations that participate in these programmes and abide by their stated privacy principles, the overall perception of trust in e-commerce is still heavily damaged by the majority of organisations that do not participate. Thus, this solution to the trust problem of e-commerce is not effective enough to make a significant difference.

A more technological decision-helping solution is the Platform for Privacy Preferences (P3P) [56], from the World Wide Web Consortium (W3C). Its main philosophy is that if individuals have to give up some privacy in order to transact with an e-business, they should at least be able to make an informed choice as to which e-businesses they wish to interact with. To achieve this, P3P-enabled e-businesses make available their P3P policy, a set of privacy practices that their website and company adhere to. P3P-enabled individuals create their own policy, deciding what privacy practices they find acceptable. These two items come together when a user visits an e-business's website, where a P3P agent under the individual's control compares the two policies, informing the individual about their similarities and whether they match. This process allows individuals to tailor their relationships with different e-businesses, releasing different amounts of personal information accordingly. P3P, however, has one main drawback: an e-business's P3P policy only states what its policies are; it does not ensure these policies are actually enforced. Once again we have reached a point where the user has to trust the e-business to keep its promises.
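The comparison a P3P agent performs can be pictured with a toy policy matcher. The data categories and purposes below are invented for the example; real P3P policies are XML documents using a fixed vocabulary of data categories, purposes, recipients and retention rules:

```python
# Site and user "policies" as sets of purposes per data category.
site_policy = {
    "email":   {"order-confirmation", "marketing"},
    "address": {"delivery"},
}
user_prefs = {
    "email":   {"order-confirmation"},  # user rejects marketing use
    "address": {"delivery"},
}

def policy_mismatches(site, prefs):
    """Return, per data category, the site purposes the user has not
    accepted; an empty result means the two policies match."""
    issues = {}
    for category, purposes in site.items():
        allowed = prefs.get(category, set())  # unknown category: allow nothing
        extra = purposes - allowed
        if extra:
            issues[category] = extra
    return issues

print(policy_mismatches(site_policy, user_prefs))  # {'email': {'marketing'}}
```

The agent can then warn the user about the mismatch before any information is released; nothing here, of course, forces the site to honour its declared policy.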

4.2.2 Enforcement techniques

In an attempt to counter the main drawback of P3P, Ashley, Powers and Schunter created an extended variant of it which works towards enterprise-wide enforcement of P3P policies [6, 7]. Organisations create an "Enterprise Privacy Policy" which is then enforced by a protected system holding the consumer's data. The system grants or denies attempts to access information and creates an audit trail that can be requested by the consumer. While this is a good step forward for P3P enforcement, ensuring that employees of the company can only access a consumer's data for agreed-upon reasons, the consumer must still assume that the company has indeed protected its system, and therefore that the company as a whole is trustworthy.
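The grant-or-deny-plus-audit behaviour can be sketched as follows (the class, method and purpose names are invented for illustration, not the E-P3P interface):

```python
class PrivacyEnforcer:
    """Sketch of enterprise-side enforcement: access to a consumer's
    data is granted only for purposes the consumer consented to, and
    every attempt, granted or denied, is written to an audit trail that
    the consumer can later request."""

    def __init__(self, consented_purposes):
        self._consent = set(consented_purposes)
        self.audit_trail = []  # entries: (employee, field, purpose, granted)

    def access(self, employee, field, purpose):
        granted = purpose in self._consent
        self.audit_trail.append((employee, field, purpose, granted))
        return granted

store = PrivacyEnforcer(consented_purposes={"delivery"})
store.access("warehouse-staff", "address", "delivery")  # granted: True
store.access("sales-team", "address", "marketing")      # denied: False
```

The sketch also makes the residual trust assumption visible: the consumer must believe the enforcer itself is deployed and not bypassed.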

A more consumer-centric privacy-enforcement solution was presented by Elovici, Shapira and Maschiach [25]. They presented a model for hiding the group interests of a group of individuals who share a common point of access to the Internet. The model works by generating faked transactions in various fields of interest, in an attempt to prevent the real group profile from being inferred. The raison d'être for the model is to allow individuals within the group to identify themselves to, and thence make use of, various services (such as digital libraries, specialised databases, etc.) without allowing eavesdroppers to infer a common group interest. This could be used, for example, to prevent someone inferring the direction of research within research groups in rival companies. The measure of the model's success is based upon the "degree of confusion" the system can inflict upon eavesdroppers.
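The faked-transaction idea can be sketched as follows (the interest fields and query strings are invented; the cited model's actual transaction generation is more elaborate):

```python
import random

INTEREST_FIELDS = ["cryptography", "databases", "networking", "graphics"]

def pad_with_fakes(real_queries, fakes_per_field=3, seed=1):
    """Mix the group's real queries with dummy queries spread evenly
    over many interest fields, so an eavesdropper on the shared access
    point observes a flat profile rather than the group's true focus."""
    rng = random.Random(seed)  # seeded only for reproducibility here
    fakes = [(field, "dummy-query") for field in INTEREST_FIELDS
             for _ in range(fakes_per_field)]
    mixed = list(real_queries) + fakes
    rng.shuffle(mixed)  # interleave real and dummy traffic
    return mixed

real = [("cryptography", "real-query-1"), ("cryptography", "real-query-2")]
traffic = pad_with_fakes(real)
# 2 real + 12 dummy queries: cryptography no longer dominates clearly.
```

The "degree of confusion" mentioned above would then be measured by how close the observed field distribution is to uniform.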

Another technological solution to preserving privacy by enforcement was presented by Rezgui, Ouzzani, Bouguettaya and Medjahed [60]. Their system concentrates on preserving the privacy of a citizen's personal information in web services in general, and in e-government applications in particular. Its main thrust is in enforcing privacy by requiring entities which wish to access data to provide credentials proving that they are allowed to access it, by filtering out data that they are not allowed to access, and finally by delivering the data through mobile privacy-preserving agents which enforce the privacy of the data on the remote site. However, the security of mobile agents is a major problem that has not yet been addressed adequately (see [60] for an overview), and consequently any solution that relies on mobile agents cannot currently offer security guarantees.

Another onymous technology, dealing specifically with preserving the privacy of a consumer's preferences when utilising newly developed search technologies in e-commerce, was proposed by Smith and Shao [69]. The approach in this case is to maximise a consumer's privacy through a process of gradual release of their preferences, attempting to minimise the amount released while still acquiring (near-)optimal search results. The approach currently lacks any concrete measures of privacy, however, so its effectiveness has yet to be seen in real-world situations.
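The gradual-release loop can be sketched as follows (the search service and result-quality test are hypothetical callbacks standing in for a real system, not code from [69]):

```python
def gradual_release(preferences, search, good_enough):
    """Reveal one preference at a time, stopping as soon as the results
    are acceptable, so only a prefix of the full preference list is
    ever disclosed to the e-business."""
    released, results = [], None
    for pref in preferences:
        released.append(pref)
        results = search(released)
        if good_enough(results):
            break  # stop releasing: results are already acceptable
    return released, results

# Toy stand-ins: "search" just counts released preferences, and two
# released preferences are deemed sufficient.
released, results = gradual_release(
    ["genre=sci-fi", "format=paperback", "price<10"],
    search=lambda prefs: len(prefs),
    good_enough=lambda r: r >= 2,
)
# Only the first two of the three preferences were released.
```

What the paragraph identifies as missing is precisely a principled `good_enough`: without a concrete privacy measure, the stopping rule is ad hoc.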

4.2.3 Summary

Onymous techniques can be very useful in e-commerce as they support the ideal of consumer privacy (attempting to maximise the amount preserved) whilst still allowing fully onymous, verifiable transactions to occur between consumer and e-business. However, compared to anonymising technologies, relatively few technologies of this type have been proposed, possibly because realistically usable methods of maintaining consumer privacy onymously spring to mind less readily than their anonymous counterparts. A summary of the techniques surveyed is presented in Table 2, where we compare each of the techniques by their basic philosophy (whether they are decision helping or enforcement), whether they require a trusted third party, and whether they are applicable to the general area of e-commerce. As we can see, of the relatively few onymous privacy enhancing technologies so far proposed, only two actually fulfil the ultimate goal of enforcing consumer privacy in e-commerce without requiring a trusted third party to operate: the technologies proposed by Elovici, Shapira and Maschiach [25] and by Smith and Shao [69]. Given that onymous technologies are likely to have a bigger impact on levels of e-commerce activity for the reasons stated in this paper, more work in this area is clearly desirable.

4.3 The future of privacy enhancing technologies

In this section we attempt to identify some directions for future work on the development of privacy enhancing technologies for e-commerce applications, based on our analysis of the existing techniques surveyed in the previous two sections.

Current privacy enhancing technologies are making significant achievements in preserving the privacy of individuals through user anonymity. In the cases where anonymity is possible (and indeed desirable), and the caveats mentioned are managed satisfactorily, this is a very viable and effective approach and definitely merits the attention it currently receives. However, more work is required in designing systems that can manage these caveats automatically, and


Table 2 Onymous privacy enhancing technologies

Technology References Privacy philosophy T3P Applicable to e-commerce

Helper Enforcement

TRUSTe truste.com x x x

P3P [20, 56] x x x

E-P3P [6, 7] x x x

ESM’s [25] x x

ROBM’s [60] x x

SS’s [69] x x

Key: T3P = Trusted Third Party

also in relaxing the requirement for trusted third parties in their architectures. This will then provide total anonymity/pseudonymity, rather than just concentrating on the basic methods and protocols that provide anonymity at the base level.

It is not always possible, however, for a user to remain anonymous. This is especially true in the area of e-commerce, where names, payment details, delivery details, etc. are required when one buys material goods over the Internet and enters into a contract with a company. Thus we feel that onymous technologies are needed to allow an individual to preserve their privacy whilst simultaneously fulfilling the practical necessity for fully accountable and identifiable transactions.

So far, in this area of preserving privacy in onymous circumstances, only a few technologies have been produced, the main contender (at least as far as take-up is concerned) being the Platform for Privacy Preferences (P3P). However, this technology does not actually enforce individual privacy; it is a technology to help guide decision making about whom to trust. More technologies, and specifically technologies that attempt to enforce the preservation of privacy, are required to fill this void.

If we accept that there is a need to produce new and more advanced onymous technologies, then we need to understand fundamentally how consumer privacy in the e-commerce context may be measured. That is, we need to consider, for example: whether some types of information are worth more than others, whether there is any correlation between what businesses and individuals value the most, and whether certain kinds of private information hold more economic, and thus bargaining, power. Establishing these measures will give us a sound basis for developing onymous technologies, and allow us to fully evaluate and compare any technologies that may be created in future.

Finally, there is a need to study privacy enhancing technologies in the light of security requirements. Allowing any individual to transact anonymously on the Internet is sometimes considered a potential security problem in today's world. However, in the specific context of e-commerce, it is easy to argue that any individual who truly desires to be anonymous can simply choose not to transact electronically and instead go into a physical shop in person and pay with cash. Thus, stopping anonymous e-transactions may not get rid of this potential security problem; it only removes it from e-commerce, shifting it to a different area. We therefore believe that more studies are needed in order to understand how the two sets of requirements can be meaningfully balanced in e-commerce, and we feel that onymous technologies can play an important role in viable solutions.

5 Conclusion

In this paper, we have given a brief overview of the history and development of the idea of "privacy" from ancient times through to today's world, and explored exactly how this idea of privacy fits within the developing world of e-commerce. Our analysis has suggested that protecting individual privacy in e-commerce is important and beneficial to both consumers and e-businesses.

From a consumer's point of view, releasing the minimum possible amount of personal information, interests and preferences to an e-business is simply a means to an end. The idea of releasing as little information as possible is the "means," while the "end" is entirely dependent upon an individual's personal viewpoints. If one believes that privacy is an important human value and that an individual should have total control over personal information (the control theory of privacy), then the end is to retain one's individuality in society (and all that this entails) by keeping control of one's information. If one believes that privacy is characterised by other people knowing as little about oneself as possible (the limited access theory of privacy), then the end is to keep the maximum amount of personal information private. If one believes that privacy is less noble than these grand social theories and is simply about the economics of information (the economic theory of privacy), then the end is to retain an economic advantage in one's dealings with business by retaining control of one's property and assets. Since the majority of people agree with at least one of these "ends" of privacy, the ability to retain individual privacy is highly important.

From a business's point of view, supporting technologies that enable individuals to retain privacy makes good business sense. Surveys have shown that e-commerce is losing a significant amount of income because of the privacy and security concerns of its potential user base. Whatever a business feels about the "right to privacy" of its customers, embracing technologies that give some degree of control to its customers should alleviate many consumer concerns, with the knock-on effect that the take-up of e-commerce should increase. This benefit may become more evident in the newly emerging computing paradigm of service-oriented computing across the Grid infrastructure [29], where more businesses will be expected, if not forced, to operate electronically. To e-businesses, therefore, the ability to give consumers control of their privacy, in an attempt to create an acceptable level of trust, is essential.

We have also surveyed some of the existing technologies for enhancing consumer privacy in e-commerce. While these technologies represent a promising start, much still needs to be accomplished to advance the state of the art. Anonymising technologies are highly effective at preserving privacy in many cases, but they are not the only possible type of solution, and in some cases they are not a viable solution at all. In such situations, where anonymity is either not possible or not desirable, onymous (non-anonymous) technologies could play a major role. Currently there is little work on developing onymous technologies, and further work in this area would be beneficial to all parties involved in e-commerce.


References

1. Ackerman, M.S., Cranor, L.F., & Reagle, J. (1999). Privacy in e-commerce: examining user scenarios and privacy preferences. In Proceedings of the 1st ACM conference on electronic commerce (pp. 1–8). New York: ACM

2. Adam, N.R., & Worthmann, J.C. (1989). Security-control methods for statistical databases: a comparative study. ACM Computing Surveys, 21(4), 515–556

3. Agrawal, D., & Aggarwal, C.C. (2001). On the design and quantification of privacy preserving data mining algorithms. In Proceedings of the twentieth ACM SIGMOD-SIGACT-SIGART symposium on principles of database systems (pp. 247–255). New York: ACM

4. Agrawal, R., & Srikant, R. (2000). Privacy-preserving data mining. In Proceedings of the 2000 ACM SIGMOD international conference on management of data (pp. 439–450). New York: ACM

5. Arendt, H. (1958). The human condition. Chicago: University of Chicago Press

6. Ashley, P., Hada, S., Karjoth, G., & Schunter, M. (2002). E-P3P privacy policies and privacy authorization. In Proceedings of the ACM workshop on privacy in the electronic society (pp. 103–109). New York: ACM

7. Ashley, P., Powers, C., & Schunter, M. (2002). From privacy promises to privacy management: a new approach for enforcing privacy throughout an enterprise. In Proceedings of the 2002 workshop on new security paradigms (pp. 43–50). New York: ACM

8. Beck, L.L. (1980). A security mechanism for statistical databases. ACM Transactions on Database Systems, 5(3), 316–338

9. Benn, S.I., & Gaus, G.F. (1983). Public and private in social life. St. Martin's Press

10. Bloustein, E.J. (1964). Privacy as an aspect of human dignity. New York University Law Review, 39, 962–1007

11. Brin, D. The transparent society. http://www.wired.com/wired/archive/4.12/fftransparent.html

12. Camenisch, J., & Lysyanskaya, A. (2001). An efficient system for non-transferable anonymous credentials with optional anonymity revocation. In B. Pfitzmann (Ed.), Advances in cryptology—EUROCRYPT 2001. Lecture Notes in Computer Science, vol. 2045 (pp. 93–118). Heidelberg: Springer

13. Camenisch, J., & Van Herreweghen, E. (2002). Design and implementation of the idemix anonymous credential system. In Proceedings of the 9th ACM conference on computer and communications security (pp. 21–30). New York: ACM

14. Chaum, D. (1985). Security without identification: transaction systems to make big brother obsolete. Communications of the ACM, 28(10), 1030–1044

15. Chaum, D., & Evertse, J.-H. (1987). A secure and privacy-protecting protocol for transmitting personal information between organizations. In CRYPTO 86. Lecture Notes in Computer Science, vol. 263 (pp. 118–167). Heidelberg: Springer

16. Chaum, D.L. (1981). Untraceable electronic mail, return addresses, and digital pseudonyms. Communications of the ACM, 24(2), 84–90

17. Chen, L. (1995). Access with pseudonyms. In E. Dawson and J. Golic (Eds.), Cryptography: policy and algorithms. Lecture Notes in Computer Science, vol. 1029 (pp. 232–243). Heidelberg: Springer

18. Chin, F.Y., & Ozsoyoglu, G. (1982). Auditing and inference control in statistical databases. IEEE Transactions on Software Engineering, 8(6), 113–139

19. Clayton, R., Danezis, G., & Kuhn, M.G. (2001). Real world patterns of failure in anonymity systems. In Information hiding: 4th international workshop, IHW 2001. Lecture Notes in Computer Science, vol. 2137 (pp. 230–244). Heidelberg: Springer

20. Cranor, L.F. (2002). The role of privacy advocates and data protection authorities in the design and deployment of the platform for privacy preferences. In Proceedings of the 12th annual conference on computers, freedom and privacy (pp. 1–8). New York: ACM

21. Cyber Dialogue (2001). Cyber dialogue survey reveals lost revenue for retailers due to widespread consumer privacy concerns. http://www.cyberdialogue.com/news/releases/2001/11-07-uco-retail.html

22. Damgård, I. (1990). Payment systems and credential mechanism with provable security against abuse by individuals. In CRYPTO 88. Lecture Notes in Computer Science, vol. 403 (pp. 328–335). Heidelberg: Springer

23. Denning, D.E., & Denning, P.J. (1979). Data security. ACM Computing Surveys, 11(3), 227–249

24. Dingledine, R., Mathewson, N., & Syverson, P. (2004). Tor: the second-generation onion router. In Proceedings of the 13th USENIX security symposium (August)


25. Elovici, Y., Shapira, B., & Maschiach, A. (2002). A new privacy model for hiding group interests while accessing the web. In Proceedings of the ACM workshop on privacy in the electronic society (pp. 63–70). New York: ACM

26. European Parliament. Report on Echelon. http://www.europarl.eu.int/tempcom/echelon/pdf/rapport_echelon_en.pdf

27. Forrester Research (1999). Post-web retail (September). http://www.forrester.com/

28. Forrester Research (2001). Privacy concerns cost e-commerce $15 billion (September). http://www.forrester.com/

29. Foster, I., Kesselman, C., & Tuecke, S. (2001). The anatomy of the grid: Enabling scalable virtual organizations. International Journal of High Performance Computing Applications, 15(3), 200–222

30. Fried, C. (1968). Privacy [a moral analysis]. Yale Law Journal, 77, 475–493

31. Gartner Research (2005). Increased phishing and online attacks cause dip in consumer confidence (June). http://www.gartner.com/

32. Gerstein, R.S. (1978). Intimacy and privacy. Ethics, 89, 76–81

33. Goldschlag, D., Reed, M., & Syverson, P. (1999). Onion routing. Communications of the ACM, 42(2), 39–41

34. Goldschlag, D.M., Reed, M.G., & Syverson, P.F. (1996). Hiding routing information. In Information hiding (pp. 137–150). Berlin/Heidelberg: Springer

35. Goldwasser, S. (1997). Multi party computations: past and present. In Proceedings of the sixteenth annual ACM symposium on principles of distributed computing (pp. 1–6). New York: ACM

36. Habermas, J. (1989). The structural transformation of the public sphere: an inquiry into a category of bourgeois society. Cambridge: MIT (translated by T. Burger)

37. Harris Interactive (2002). First major post-9/11 privacy survey finds consumers demanding companies do more to protect privacy. http://www.harrisinteractive.com/news/allnewsbydate.asp?NewsID=429

38. IBM Global Services (1999). IBM multi-national consumer privacy survey. Conducted by Louis Harris and Associates, Inc. http://www.ibm.com/services/files/privacy_survey_oct991.pdf

39. Jupiter Research (2002). Seventy percent of US consumers worry about online privacy, but few take protective action. http://www.jupiterresearch.com/xp/jmm/press/2002/pr_060302.html

40. Kahn, D. (1996). The history of steganography. In Proceedings of the first international workshop on information hiding, London, UK (pp. 1–5). Berlin: Springer

41. Katzenbeisser, S., & Petitcolas, F.A. (Eds.) (2000). Information hiding techniques for steganography and digital watermarking. Norwood: Artech House

42. Kewney, G. Wireless lamp posts take over the world! http://www.theregister.co.uk/content/69/34894.html

43. Lysyanskaya, A., Rivest, R.L., Sahai, A., & Wolf, S. (1999). Pseudonym systems. In Proceedings of the sixth annual workshop on selected areas in cryptography (SAC’99). Lecture Notes in Computer Science, vol. 1758. Heidelberg: Springer

44. McBurney, P., & Parsons, S. (2003). Posit spaces: a performative model of e-commerce. In Proceedings of the second international joint conference on autonomous agents and multiagent systems (pp. 624–631). New York: ACM

45. Milberg, S.J., Burke, S.J., Smith, H.J., & Kallman, E.A. (1995). Values, personal information privacy, and regulatory approaches. Communications of the ACM, 38(12), 65–74

46. Moor, J.H. (1997). Towards a theory of privacy in the information age. ACM SIGCAS Computers and Society, 27(3), 27–32

47. Moores, T.T., & Dhillon, G. (2003). Do privacy seals in e-commerce really work? Communications of the ACM, 46(12), 265–271

48. Murphy, R.F. (1984). Social distance and the veil. In Philosophical dimensions of privacy: an anthology (pp. 34–54). Cambridge: Cambridge University Press (chapter 2)

49. Nielsen, J., Molich, R., Snyder, C., & Farrell, S. (2000). E-commerce user experience. Technical report, Nielsen Norman Group

50. Odlyzko, A. (2003). Privacy, economics, and price discrimination on the internet. In Proceedings of the 5th international conference on electronic commerce (pp. 355–366). New York: ACM

51. Patton, M.A., & Jøsang, A. (2004). Technologies for trust in electronic commerce. Electronic Commerce Research, 4(1–2), 9–21

52. Posner, R.A. (1984). An economic theory of privacy. In Philosophical dimensions of privacy: an anthology (pp. 333–345). Cambridge: Cambridge University Press (chapter 15)

53. Privacy International. National ID cards. http://www.privacy.org/pi/activities/idcard/

54. Prosser, W.L. (1960). Privacy [a legal analysis]. Harvard Law Review, 48, 338–423

55. Rachels, J. (1975). Why privacy is important. Philosophy and Public Affairs, 4(4), 323–333



56. Reagle, J., & Cranor, L.F. (1999). The platform for privacy preferences. Communications of the ACM, 42(2), 48–55

57. Reiss, S.P. (1984). Practical data-swapping: the first steps. ACM Transactions on Database Systems, 9(1), 20–37

58. Reiter, M.K., & Rubin, A.D. (1998). Crowds: anonymity for web transactions. ACM Transactions on Information and System Security, 1(1), 66–92

59. Reiter, M.K., & Rubin, A.D. (1999). Anonymous web transactions with crowds. Communications of the ACM, 42(2), 32–48

60. Rezgui, A., Ouzzani, M., Bouguettaya, A., & Medjahed, B. (2002). Preserving privacy in web services. In Proceedings of the fourth international workshop on Web information and data management (pp. 56–62). New York: ACM

61. Saxonhouse, A.W. (1983). Classical Greek conceptions of public and private. In Public and private in social life (pp. 363–384). New York: St. Martins (chapter 15)

62. Schlaeger, C., & Pernul, G. (2005). Authentication and authorisation infrastructures in B2C e-commerce. In Proceedings of the sixth international conference on electronic commerce and Web technologies (EC-Web’05). Lecture Notes in Computer Science. Heidelberg: Springer

63. Schlörer, J. (1975). Identification and retrieval of personal records from a statistical data bank. Methods of Information in Medicine, 14(1), 7–13

64. Schlörer, J. (1977). Confidentiality and security in statistical data banks. In Proceedings of workshop on data documentation (pp. 101–123). Munich: Verlag Dokumentation

65. Schlörer, J. (1981). Security of statistical databases: multidimensional transformation. ACM Transactions on Database Systems, 6(1), 95–112

66. Schoeman, F. (1984). Privacy: philosophical dimensions of the literature. In Philosophical dimensions of privacy: an anthology (pp. 1–33). Cambridge: Cambridge University Press (chapter 1)

67. Schoeman, F.D. (1984). Philosophical dimensions of privacy: an anthology. Cambridge: Cambridge University Press

68. Shoshani, A. (1982). Statistical databases: characteristics, problems, and some solutions. In Proceedings of the eighth international conference on very large data bases, September 8–10, Mexico City, Mexico (pp. 208–222). San Francisco, CA: Morgan Kaufmann

69. Smith, R., & Shao, J. (2003). Preserving privacy when preference searching in e-commerce. In P. Samarati and P. Syverson (Eds.), Proceedings of the 2003 ACM workshop on privacy in the electronic society (WPES’03) (pp. 101–110). New York: ACM

70. Syverson, P. (2003). The paradoxical value of privacy. In Proceedings of the 2nd annual workshop on economics and information security, WEIS 2003

71. Thomson, J.J. (1975). The right to privacy. Philosophy and Public Affairs, 4(4), 295–314

72. Tuerkheimer, F.M. (1993). The underpinnings of privacy protection. Communications of the ACM, 36(8), 69–73

73. US Office of Federal Statistical Policy and Standards (1978). Statistical policy working paper 2: report on statistical disclosure and disclosure avoidance techniques

74. Volokh, E. (2000). Personalization and privacy. Communications of the ACM, 43(8), 84–88

75. Warren, S.D., & Brandeis, L.D. (1890). The right to privacy [the implicit made explicit]. Harvard Law Review, 4(5), 193–220

76. Wasserstrom, R.A. (1984). Privacy: some arguments and assumptions. In Philosophical dimensions of privacy: an anthology (pp. 317–332). Cambridge: Cambridge University Press (chapter 14)

77. Weintraub, J. (1997). The theory and politics of the public/private distinction. In Public and private in thought and practice (pp. 1–42). Chicago: University of Chicago Press (chapter 1)

78. Weintraub, J., & Kumar, K. (1997). Public and private in thought and practice. Chicago: University of Chicago Press

79. Westin, A.F. (1967). Privacy and freedom. New York: Atheneum

80. Westin, A.F. (1984). The origins of modern claims to privacy. In Philosophical dimensions of privacy: an anthology (pp. 56–74). Cambridge: Cambridge University Press (chapter 3)

81. Westin, A.F. (1991). Equifax-Harris consumer privacy survey. New York: Louis Harris & Associates

82. Westin, A.F. (1994). Equifax-Harris consumer privacy survey. New York: Louis Harris & Associates

83. Yates, J. (1769). Millar vs. Taylor. In 4 Burr. (pp. 2303–2379)