
15 Minutes of Privacy

R. Gill

April 8, 2011

Abstract

This paper is a casual saunter through issues of privacy, trust and security in the ubiquitous computing domain. The transition to ubiquitous environments has created a problem regarding the type of context information to monitor with or without a user's consent. This paper presents work demonstrating that ubiquitous monitoring affects human behaviour and that human notions of privacy are context sensitive. The paper also surveys progress on transient trust mechanisms in the ubiquitous environment.

    1 Introduction

15 Minutes of Privacy: on first reading this option for a review assignment, the author assumed artistic pretension to modify such an iconic phrase from an iconic icon such as Warhol [Kaplang92a]. However, on further reflection, the mobile distributed digitally credited packets dropped, of course! Are we en route to a world in which privacy is something of a fantasy, achievable by a few and in the main fleeting; something that once achieved can disappear, rendering the person a has-been with only memories and souvenirs of a life with privacy? Or a brief experience in which a person can do, think, say, or act in any way they wish within the confines of their own space, without fear of redress from the dreaded multi-context and location-aware nanobot sensors liberally peppered and embedded in our houses, our clothes, our cosmetics, our medicines, our food, in us! In the not too distant future, will we visit a doctor because we have a sensor virus causing congestion in our digestive tract sensor cluster? However fanciful a scenario this may seem, Weiser's [Weiser91a] vision of ubiquitous, instant and spontaneous invisible technology, coupled with progress in medical nanotechnology [Zhou05a] and miniaturisation [Ratner02a], brings 24-hour monitoring and archiving of all human behaviour and habits, even emotions [Picard02a], a step closer to practical reality. In such a potential future scenario, security and privacy, or the lack of it, shall and must be high on the agenda for discussion. This paper reviews some instances of ubiquitous technology, discusses privacy in the context of social behaviour and trust, and touches on technical aspects of security in transient use of computers. Firstly, an extended introduction is provided to introduce the notion of privacy, habits and behaviours.


2 Privacy, Habits and Behaviour - an extended introduction

What is privacy? We often hear phrases such as "an invasion of privacy" or "in the privacy of one's own home". The dictionary [OED10a] definition of privacy is:

The state or condition of being alone, undisturbed, or free from public attention, as a matter of choice or right; seclusion; freedom from interference or intrusion.

The definition supports the notion of privacy as some kind of invisible force field in place to ward off attacks from intruders, in which there is a choice of who may penetrate and who may not. Starkly missing from the definition is the word "individual": is this implicit, or intentionally left out of the definition? If implicit, then how can an individual be fully aware of any monitoring or surveillance mechanisms that might be in place? If intentionally left out, then privacy cannot be regarded as an individual choice, rather a choice made on our behalf. Historically, privacy has been a commodity attainable by the rich and privileged, or viewed as an eccentric behaviour that can only be indulged in if you have the money to do so. Conspiracy theorists and the powers that be simply turn a blind eye, while continuously monitoring events. When monitored behaviour becomes untenable, the veil of privacy is publicly lifted. Lifting of these veils makes for inquisitive feeding frenzies among the general public, eager to penetrate the privacy of celebrities through media coverage. What is it that makes us want to know the habits of celebrities? Is it that day-to-day habitual behaviour breaks down our feeling of unworthiness, by knowing that celebrities are really just as normal as anyone?

Such divulgences of personal habits can also be positive. It was reported that Her Majesty Queen Elizabeth II of England habitually eats breakfast served from Tupperware tubs [Tweedie03a]. Rather than be shocked that the Queen of England does not eat from ornate gold-plated tableware, the public were simply warmed, knowing that although a Queen, she is human like all of us; the revelation boosted her once plummeting popularity, as well as the share price of Tupperware, whose brand image went from fancy plastic food containers to food containers with royal approval. Is eating from a Tupperware tub a behaviour or a habit? Both are changeable over time. The article was released in a reputable broadsheet not because of Tupperware, but because the intrusion of privacy was regarded as a breach of security on a national level, and a full enquiry ensued. However, it was the Tupperware habit that consumed people's interest.

Ubiquitous computing professionals, described here as UbiComers, often sidestep the thorny issue of privacy when reporting the latest working ubiquitous application, writing "habits" instead of "personal habits" and "behaviours" instead of "personal behaviour".

Ubiquitous computing technology is essentially centred around "smartifying" everyday objects and how people interact with these objects to carry out habits and behaviour, as individuals and in groups. As with most technology, ubiquitous computing can be used for purposes other than the originally intended one. Still in its infancy, ubiquitous computing is wild with excitement about the advances being made. As yet no real incident has occurred, that we know of, in which a breach of privacy from ubiquitous computing services has led to any kind of lawsuit or court action.

However, just as concerns have grown regarding privacy on social networking sites, it is only a matter of time before the same concerns are raised for ubiquitous computing on a global scale, once it becomes a mainstream technology. Social networking sites such as Facebook [Facebook11a] and global information providers such as Google [Google11a] accrue vast revenue through ownership of information, which is distilled to specification and sold on. Just as a pimp hands information on potential johns to his workers, who then accidentally catch the eye of those listed; in the same fashion as models draped over the petrol tank of a new Harley Davidson at a motor show, or the tabloid newspaper editor who carefully chooses words and phrases they themselves would never use for the next day's front-page headline: they know their demographic. If this is abuse, then surely it is abuse of information rather than of privacy, information pertaining to demography rather than to the individual. It is the accumulation of individual private information, then, that is the real concern. Social networking sites unknowingly furnish personal individual information to the would-be stalker, paedophile or identity thief. Will ubiquitous computing unknowingly furnish personal habits and behaviours to the would-be fundamentalist, dictator or blackmailer? One major difference between the privacy issues of online social networks and ubiquitous computing is that in ubiquitous computing the user does not need to be online, or even have an internet connection. The sure way of not providing information online is simply not to take part: not to go online and not to provide information; we have the choice to turn off the computer, modem or mobile device. This poses a dilemma, because ubiquitous and pervasive computing is, by vision, non-intrusive, invisible and everywhere; however, to function and provide context-aware personalised services, the technology demands intrusion in the form of monitoring, surveillance and access to personal information, patterns of behaviour and habit [Jonsson06a], [Cas05a], [Lyytinen02a], [Koskela03a].

Once mainstream, ubiquitous technology will generate a terabyte mountain range of personal private information, far surpassing the granularity of detailed demographics, national censuses and social network profiles, much valued by governmental taxation departments and corporate advertising executives alike. For ubiquitous technology to reach its full potential, it requires personal individual information and has the potential to record, track, and predict with a certain probability an individual's habits and behaviours, in order to provide a service to the user. The issue of ownership of and access to this information can often be overshadowed by the white heat of ubiquitous technology. Authors such as Bell and Dourish [Bell07a] assert that ubiquitous computing is already upon us. Add to this Weiser's [Weiser91a] assertion that technology will itself become invisible, available anytime and anywhere, and cast-iron, or in this case tungsten carbide, measures and guidelines regulating this flow of information, the lifeblood of ubiquitous computing, become necessary. Although some initiatives on guidelines for privacy in ubiquitous computing are underway [Lahlou05a], privacy issues are not taken seriously by UbiComers, who seem to regard privacy as secondary to project deliverables and design requirements [Lahlou05b]. One only need note the scarcity of serious literature underpinning the inequality of privacy legislation compared to advances in technical achievement in ubiquitous computing.

3 Ubiquitous surveillance and its effect on human social behaviour

Surveillance and monitoring in the ubiquitous domain differ from traditional data acquisition systems in five basic ways.

1. Scale of Data Collection - Monitoring has a wider range of areas and objects that can be monitored, such as houses, offices, fridges, humans, ovens; almost anything that can carry or have an embedded networked sensor.

2. Manner of Collection - We are unaware that monitoring is taking place.

3. Type of Data Collected - Anything that can be transcribed into a digital signature can be collected accurately, from location, movement and environmental changes, even to emotion.

4. Motivation to Collect Data - All and any data in digital form is analysable and considered valuable.

5. Accessibility of Data - Ubiquitous surveillance demands collection of large volumes of data to provide context-aware services to users.

Providing the surveyed user with privacy options that limit any of the five points listed above only fuels the conflict between the principle of protecting privacy and the main thrust of ubiquitous computing. On the one hand, ubiquitous computing requires access to any and all collectable data in order to provide context-aware services; on the other hand, it is practically difficult to inform the user of what data is being collected (as this is dynamic), and doing so fundamentally goes against the ubiquitous mantra of being invisible and unobtrusive in everyday behaviour.

The work of Jonsson [Jonsson06a] argues that by embedding surveillance technology in the physical environment, the technology and cues of surveillance become literally and virtually concealed from the user. Realistically, the user is, or should be, aware that surveillance is ongoing, and the only realistic attitude for human beings living in such environments is to assume that any activity or inactivity is being monitored, analysed, transferred, stored and maybe used in any context in the future [Lyytinen02a].

By doing so, direct reminders of surveillance are embedded together with the technology, creating an embedded panopticon. Here the panopticon is used as a metaphor for surveillance, because the environmental description is comparable to a panoptic society [Koskela03a]. Originally a design to elicit self-control in prisoners, the panopticon was an architectural design for a circular prison complex in which all prison cell doors (with bars) faced a central observation tower. The prisoners did not know exactly when they were being watched or who was watching them, so they assumed they were being watched all the time. The omnipresence of the central observation tower reminded prisoners of the possibility of being watched at any time. The panoptic effect enforced a self-control over prisoners' visible behaviour. In the context of ubiquitous computing monitoring, applying the panoptic metaphor for surveillance is not straightforward, because there is no omnipresent central reminder that monitoring is taking place. Behaviour under ubiquitous panoptic monitoring also differs: the resistance of the monitored in the original prison environment was replaced by an accepting resolve, whereas in a ubiquitous computing environment resistance behaviour does take place. Such resistance takes the form of the monitored tampering with the monitoring infrastructure to determine when monitoring takes place, instead of accepting constant monitoring as a collective group. Comparing a group of incarcerated prisoners that society does not trust with humans free to roam and act as they wish has obvious differences. However, the panopticon has been shown to be a powerful influence on human behaviour, and to a certain extent on self-control over human behaviour, incarcerated or not.

Ubiquitous monitoring of user location and location context awareness is high on the ubiquitous agenda because it is no longer driven by the user carrying a mobile device, but by the ubiquitous embedded environment the user finds themselves in. A seminal work on location awareness by Adlesee [Adlesee01a] is typical of the kind of ubiquitous monitoring system currently implemented, but goes further to model real-world environments, metaphorically casting the application as a personal assistant and diary. Adlesee [Adlesee01a] introduces the notion of a sentient infrastructure that models a real indoor environment in real time, based on sensor information about user location and personal preferences. Moreover, the modelled environment personalises devices and applications within the real environment, thereby customising each device and application. Sentient computing is described as a method of managing mobile devices to suit the context of use; such management includes configuring devices to user needs. The sentient system updates software objects based on information received about what state and location the real-world object is in. Sensors emit an ultrasonic pulse to fixed-location receivers in the environment; the location is calculated by triangulation of time of flight. Sensor location is determined by synchronising sensors in a wireless cellular network, and receivers are reset by base stations, which communicate with sensors over wireless and reset receivers over fixed wire within the real infrastructure. The system uses 200 sensors, 3 wireless cells and 750 receivers. Sensor location is accurate to within 3 cm for 95% of readings, each base station can address 3 sensors simultaneously, and each radio cell supports 150 updates per second. This kind of accuracy was a major step forward for these kinds of location-aware systems, far more accurate than previous badge-based systems [Want02a]. Base stations run scheduling algorithms for sensors to reduce power consumption in times of non-scheduled use; in so doing, the expected quality of service for each sensor varies. When the sensor is actively in use by the user (by pressing a button), an Over the Air (OTA) message to the base stations automatically starts the location procedure. The OTA message contains the sensor's Unique Identifier (UID), and while within a base station's vicinity a sensor uses a temporary ID provided by the base station. When sensor
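To make the triangulation-by-time-of-flight step above concrete, here is a minimal sketch of how a position could be recovered from pulse travel times measured at fixed receivers. It assumes a constant speed of sound, synchronised clocks, known 2D receiver coordinates and a simple linear least-squares formulation; the function and variable names are illustrative and not taken from the cited system.

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # metres per second, assumed constant indoors


def locate(receivers, times_of_flight):
    """Estimate a 2D emitter position from ultrasonic time-of-flight readings.

    receivers:       (N, 2) array of known receiver coordinates in metres
    times_of_flight: (N,) array of pulse travel times in seconds
    """
    receivers = np.asarray(receivers, dtype=float)
    ranges = SPEED_OF_SOUND * np.asarray(times_of_flight, dtype=float)
    x, y = receivers[:, 0], receivers[:, 1]

    # Linearise ||p - r_i||^2 = d_i^2 by subtracting the first equation,
    # which cancels the quadratic term in the unknown position p.
    A = np.column_stack((x[1:] - x[0], y[1:] - y[0]))
    b = 0.5 * (x[1:] ** 2 - x[0] ** 2 + y[1:] ** 2 - y[0] ** 2
               + ranges[0] ** 2 - ranges[1:] ** 2)
    position, *_ = np.linalg.lstsq(A, b, rcond=None)
    return position


# Example: three ceiling receivers and the measured pulse arrival times
# for a sensor at roughly (1.5 m, 2.0 m).
receivers = [[0.0, 0.0], [4.0, 0.0], [0.0, 3.0]]
tof = [0.00729, 0.00933, 0.00526]
print(locate(receivers, tof))
```

With more than three receivers the same least-squares step simply gains redundancy, which is one way a dense receiver grid such as the 750-receiver installation described above can buy accuracy.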


with-friends (CSWF), and variable-privacy (VP). The main findings of the study conclude that privacy is a dynamic social process that also has aspects of something dichotomous and state-like, similar to the CP and VP privacy-sharer types of behaviour.

    4 In silicon we trust

Trust plays a vital role in relationships; providing personal information of any kind is a transaction based on trust in some form. The interaction between man and machine in the age of information technology has formed an unconscious, insidious trust that makes us divulge personal information to a machine far more readily than we would to another human being. We are more comfortable punching our credit card numbers into a hand-held device in a restaurant or shopping arcade than allowing a human being to write them down and pass them on to a bank. Questioning the integrity of the human is acceptable; however, it is quite normal behaviour not to question the integrity or security of the machine itself. If the human were indeed interested in your credit card details, they would simply present a false, rogue machine that records your details, confident that the machine will be trusted to be secure. A well-researched vision of a potential ubiquitous computing future is presented by the movie Minority Report [Spielberg02a], in which trust is based on identification of a human through retinal scanning, in both public and private environments. In public environments such as a shopping arcade, intelligent advertising screens identify and interact with humans to provide context-aware personalised information designed to make us buy consumer goods. In the film, resistance to panoptic monitoring is demonstrated by the lead character, who circumvents the monitoring system to change his identity; comically, the ubiquitous panoptic monitoring system displays a flaw in being fooled into identifying an obviously western male human as a far-eastern male human, something that a human monitoring system would most likely query. In the private environment, a more sinister and highly intrusive identification method is employed. A gang of mechanical, intelligent, networked robots (called spiders) acts as law enforcement agents, authorised to access all areas of private dwellings and perform retinal scans of any and all humans therein. The humans being scanned by the spiders exhibit a complete resolve to high-impact intrusive monitoring, perhaps identifying the spiders with authority, or perhaps simply indoctrinated over time into forgoing any right to privacy; the film does not make clear which, and it is left to the viewer to ponder. Alternatively, one could argue that the panoptic subterfuge has indoctrinated the human perception of the spider robot retinal scan as an unconscious event, without memory; effectively pushing the experience of retinal scanning so deep into the subconscious that it becomes a non-event, cloaking the spiders for what they represent; cloaking to such an extent that the spiders and the technology behind them become invisible.

Allowing agents to make trust judgement decisions on our behalf is peculiar to ubiquitous computing, because of the rate of trust judgements required for ubiquitous computing to monitor large volumes of diverse contextual data at any one time [Sillence08a]. This does not suggest humans are comfortable with not being in the trust judgement process at all. The work of Roussos [Roussos04a] studied a ubiquitous e-commerce environment and found that an end user is more comfortable using the system if they feel they can anonymously intervene. Anonymity (the cloaking of identity),

    5 Security or Secure Entity

In traditional distributed systems, the notion of trust between entities such as humans and machines is fundamental to inter-domain authentication protocols. The work of Yahalom [Yahalom93a] analysed and compared trust relationships between entities in known authentication protocols in distributed systems. The entity itself was not important, rather the nature of the operations that the entity attempts or performs. A secure system is deemed to be one in which control is exercised over which entity can perform which operation. Before trust is established between entities, whether local or inter-domain, and operations are carried out, certain criteria must be satisfied. Typical criteria to be satisfied are unique identification, unique but common message manipulation, and unique acknowledgements between entities. Normally, such criteria are resolved using cryptographic software, electronic signatures and secure certificates on authentication servers. To establish trust between entities on a traditional distributed system, then, an existing trusted entity is required to begin with.
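To illustrate the kind of cryptographic machinery these criteria rest on, the sketch below shows one way the unique-identification criterion could be checked with an electronic signature. It is a minimal illustration using the Python cryptography library, not a protocol from [Yahalom93a]; the key pair, message and helper names are assumptions for the example, and the verifier is assumed to have received the signer's public key out of band.

```python
from cryptography.hazmat.primitives.asymmetric import ed25519
from cryptography.exceptions import InvalidSignature

# Assumed setup: the trusted entity generated a key pair earlier and its
# public key was distributed out of band (e.g. via an authentication server).
signing_key = ed25519.Ed25519PrivateKey.generate()
public_key = signing_key.public_key()


def sign_message(message: bytes) -> bytes:
    """The sending entity proves its identity by signing the message."""
    return signing_key.sign(message)


def verify_message(message: bytes, signature: bytes) -> bool:
    """The receiving entity checks the signature against the known public key."""
    try:
        public_key.verify(signature, message)
        return True
    except InvalidSignature:
        return False


request = b"entity-A requests operation: read sensor log"
sig = sign_message(request)
print(verify_message(request, sig))      # True: identification criterion met
print(verify_message(b"tampered", sig))  # False: reject the operation
```

The point of the example is only that every operation can be tied to an identifiable entity; a real deployment would anchor the public key in a certificate issued by the existing trusted entity the paragraph above mentions.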

In a pervasive and ubiquitous computing environment, in which use of an entity such as a desktop computer is transient, a user has no idea whether the entity is corrupt or untrustworthy. Internet Suspend/Resume (ISR) [Satyanaranyanan07a] is a mobile computing model based on the concept of a user never having to save any files or profiles on any entity hard drive; instead, all the user's files, applications, desktop settings and so on are located on a safe server somewhere, and downloaded onto an entity as a virtual drive whenever the user requires. Once finished, the updated virtual PC state transports the changes back to the safe server. The virtual desktop is not closed down, simply suspended and restored to the last checkpoint when required.

To resolve the problem of quickly establishing trust on an unknown shared entity as a transient user, a tool called Trust Sniffer [Satyanaranyanan07b] incrementally establishes trust with the entity. The concept behind Trust Sniffer is that a user carries around their own mini operating system on a memory stick. With use of ISR in mind, when connected to a PC entity the Trust Sniffer only validates (establishes trust in) the software on the entity that the user requires to carry out an operation. Trust Sniffer uses its own mini Linux kernel that boots completely separately from the PC entity OS. On system boot, a trust-extender kernel module uses Integrity Measurement Architecture (IMA) for Linux SHA-1 (sha1sum) hash measurements of any executable code, and checks the IMA measurements against trusted IMA aggregated measurements stored in its Trusted Platform Module. If the IMA measurements of executable code and the TPM measurement list mismatch, the user is notified by a trust alerter. Here any untrusted code is detected before it can be loaded. Trust Sniffer gradually trust-validates each component of the PC entity's boot OS before the PC entity OS is allowed to fully boot. During this process of establishing a root of trust, Trust Sniffer loads the validated IMA measurement list onto the PC entity OS, which the PC entity then uses to validate other code. Once root trust is established, it can be extended to PC entity applications or simply used to interact with the user's ISR virtual machine. In effect, Trust Sniffer is a local, manual technique to resolve initial trust criteria without the need to trust an untrusted entity to start the trust validation process, prior to inter-domain communication between the PC entity and ISR.
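As a rough illustration of the measurement-checking idea (not the actual Trust Sniffer code), the sketch below computes SHA-1 digests of executables, in the way IMA records measurements, and refuses anything absent from a carried list of trusted measurements. The file paths, list contents and function names are placeholders assumed for the example.

```python
import hashlib
from pathlib import Path

# Hypothetical trusted measurement list carried by the user: SHA-1 digests of
# known-good binaries, standing in for the validated IMA aggregate.
TRUSTED_MEASUREMENTS = {
    "a9993e364706816aba3e25717850c26c9cd0d89d",  # example digest only
}


def measure(path: Path) -> str:
    """Return the SHA-1 measurement of an executable, as IMA would record it."""
    return hashlib.sha1(path.read_bytes()).hexdigest()


def validate_before_load(path: Path) -> bool:
    """Refuse to trust code whose measurement is not on the trusted list."""
    digest = measure(path)
    if digest not in TRUSTED_MEASUREMENTS:
        print(f"trust alert: {path} has unknown measurement {digest}")
        return False
    return True


# Example: check each boot component in turn before allowing it to run,
# stopping at the first mismatch, as the incremental validation describes.
for component in [Path("/boot/vmlinuz"), Path("/sbin/init")]:
    if component.exists() and not validate_before_load(component):
        break
```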

    6 Conclusion

The gathering and monitoring of contextual sensor information is an intrinsic part of ubiquitous computing. Technical challenges in acquiring and gaining access to such detailed information as location, juxtaposition, habits, behaviours and emotions are all but addressed. In addressing these challenges, ubiquitous computing has inadvertently created a fledgling panoptic-esque society, in which ubiquitous panoptic monitoring affects human behaviour. Ubiquitous computing poses unique challenges in addressing issues of security, user privacy and access to contextual user information. The single challenge which eludes UbiComers is to find a balance between unobtrusive services and services that make trust and privacy decisions on behalf of the user.

    References

[Adlesee01a] M. Adlesee, R. Curwen, S. Hodges, P. Steggles, A. Ward, A. Hopper, Implementing a Sentient Computing System, In Computer, vol 34, No 8, pp. 50-56, August 2001

[Bell07a] G. Bell, P. Dourish, Yesterday's tomorrows: notes on ubiquitous computing's dominant vision, In Personal and Ubiquitous Computing, vol 11, Issue 2, January 2007

[Cas05a] J. Cas, Privacy in pervasive computing environments: A contradiction in terms?, In IEEE Technology and Society Magazine, pp. 24-33, 2005

[Consolvo05a] S. Consolvo, Location Disclosure to Social Relations: Why, When and What People Want to Share, In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, ACM Press, pp. 81-90, 2005

[Facebook11a] Facebook.com, 2011

[Google11a] Google.com, 2011

[Jonsson06a] K. Jonsson, The Embedded Panopticon: Visibility Issues of Remote Diagnostics Surveillance, In Scandinavian Journal of Information Systems, vol 18, Issue 2, pp. 7-28, 2006

[Kaplang92a] J. Kaplang, Warhol photo exhibition, Stockholm, 1968, In Bartlett's Familiar Quotations, 16th Edition, p. 758, 1992


[Koskela03a] H. Koskela, Cam Era - The contemporary urban Panopticon, In Surveillance and Society, vol 1, Issue 3, pp. 292-313, 2003

[Anthony07a] D. Anthony, D. Kotz, T. Henderson, Privacy in Location-Aware Computing Environments, In IEEE Pervasive Computing, vol 6, Issue 4, pp. 64-72, 2007

[Lahlou05a] S. Lahlou, F. Jegou, European Disappearing Computer Privacy Design Guidelines V1.0, Ambient Agoras Report D15.4, The Disappearing Computer Initiative, October 2003

[Lahlou05b] S. Lahlou, M. Langheinrich, C. Rocker, Privacy and Trust Issues with Invisible Computers, In Communications of the ACM, vol 48, No 3, pp. 37-42, March 2005

[Lyytinen02a] K. Lyytinen, Issues and challenges in ubiquitous computing, In Communications of the ACM, vol 45, No 12, pp. 63-65, 2002

[Spielberg02a] S. Spielberg, Minority Report, A Steven Spielberg Film, 2002

    [OED10a] The Oxford English Dictionary, Oxford University Press, 2010

[Picard02a] R.W. Picard, J. Klein, Computers that recognise and respond to user emotion: theoretical and practical implications, In Interacting with Computers, Elsevier, pp. 141-169, February 2002

[Ratner02a] M. Ratner, D. Ratner, Nanotechnology: a gentle introduction to the next big idea, Prentice Hall Press, ISBN 0-13-101400-5, 2002

[Roussos04a] G. Roussos, T. Moussouri, Consumer perceptions of privacy, security and trust in ubiquitous commerce, In Personal and Ubiquitous Computing, vol 8, No 6, pp. 416-429, 2004

[Sillence08a] E. Sillence, P. Briggs, Ubiquitous Computing: Trust Issues for a Healthy Society, In Social Science Computer Review, vol 26, No 1, pp. 6-12, February 2008

[Satyanaranyanan07a] M. Satyanarayanan, Pervasive Personal Computing in an Internet Suspend/Resume System, In IEEE Internet Computing, vol 11, No 2, pp. 16-25, 2007

[Satyanaranyanan07b] M. Satyanarayanan, Rapid Trust Establishment for Pervasive Personal Computing, In IEEE Pervasive Computing, vol 6, No 4, pp. 24-30, 2007

[Tweedie03a] M. Tweedie, Footman exposes Tupperware secret of the Queen's table, The Telegraph Newspaper, England, 20 November 2003

[Want02a] R. Want, The Active Badge Location System, In ACM Transactions on Information Systems, pp. 91-102, 1992

[Weiser91a] M. Weiser, The Computer for the 21st Century, In Scientific American, September 1991

[Yahalom93a] R. Yahalom, B. Klein, T. Beth, Trust Relationships in Secure Systems - A Distributed Authentication Perspective, In Proceedings of the IEEE Symposium on Research in Security and Privacy, pp. 150-164, 1993

[Zhou05a] T. Zhou, L. Chen, K. Aihara, Molecular Communication through Stochastic Synchronization Induced by Extracellular Fluctuations, In Physical Review Letters, vol 95, Issue 17, 2005