Journal of Network Theory
With contributions from: Ryanne Turenhout / Alex Gekker / Kalina Dancheva / Lindsy Szilvási / Bob van de Velde / Maurice de Haan
© 15 April 2011 www.networktheory.nl
This is a special issue of the academic journal The Journal of Network Theory, part of the Master programme New Media & Digital Culture at the University of Utrecht.
Chief editor: Marianne van den Boomen
Guest editors: Alex Gekker, Kalina Dancheva, Lindsy Szilvási, Ryanne Turenhout, Maurice de Haan and Bob van de Velde
Front page global graphic by 5milli (Deviantart)
Contributors: Alex Gekker, Kalina Dancheva, Lindsy Szilvási, Ryanne Turenhout, Maurice de Haan and Bob van de Velde
contents
4 Regulating user-generated data on Facebook. Ryanne Turenhout
15 This network thing called 'the Web'. Bob van de Velde
26 Citizens of web 2.0: public sphere as cultural public. Lindsy Szilvási
37 Legionnaires of Chaos: "Anon" and Governments. Alex Gekker
51 Cookies and the mindset of control. Kalina Dancheva
63 Protocol (Alexander Galloway). Reviewed by Ryanne Turenhout
66 Transmetropolitan (Warren Ellis). Reviewed by Alex Gekker
69 Code version 2.0 (Lawrence Lessig). Reviewed by Kalina Dancheva
72 Bastard Culture! (Mirko Tobias Schaefer). Reviewed by Lindsy Szilvási
76 Free style: The Listeners. Bob van de Velde
Regulating user-generated data on Facebook
Ryanne Turenhout

Abstract

This article situates itself within the free labour discourse (Scholz 2008; Terranova 2000) and investigates the ways in which the business model of Facebook is inscribed into its technical design, thereby revealing how user-generated value is facilitated on Facebook. The article provides a nuanced look at the underlying mechanisms of Facebook and the various dynamics between the business model, users, platform owners and technical design. Keeping the business model in mind, the article takes a descriptive approach and first explores the ways in which user activities are regulated by means of the front-end technical design, the graphical user interface. Secondly, the intelligence of the back-end is explored, as this is an important and often neglected part of facilitating value; it is evaluated in terms of algorithms, data retention, data mining, data aggregation and the Open Graph protocol. Even though these two categories (front- and back-end) are used, it will become clear that the distinction is not always easily made. This leads up to the main argument that user activities are regulated in ways that have become invisible, which contributes to the problematic aspect that users have no control over, or insight into, what happens with their user-generated data. The issue is not so much one of personal privacy, as notions of privacy often vary according to cultural context; it is the power gained by the platform owners, and the corresponding lack of control by the users, that is at stake.
Keywords: Facebook, regulation, control, user-generated data, technical design
Facebook, with currently more than 500 million users, is one of the many examples of technology becoming increasingly important in modern life. Additionally, it has a valuable database filled with the data of its users (Vogelstein 2007), and it is safe to say that this database has grown even bigger by now. This brings with it questions of privacy and of the commodification of user-generated data; as Mark Andrejevic argues, "[t]he question is no longer just what information companies will collect about us, but how this information will be put to use, the ability to transfer it, sell it, conduct marketing experiments with it and base advertising appeals upon it" (2009). Andrejevic expressed his concern about the way that user-generated data is used. This article builds upon this notion and questions the ways in which control and regulation of user-generated data find their way into the technical design of Facebook, and how this contributes to the exploitation of this data. This is in line with what Bernhard Rieder argued: "[p]ower structures are not confined to the social realm; they also operate inside of technical artifacts and to decipher them, we need to look at these artifacts themselves" (2005, 27). The article therefore situates itself within the free labour discourse, seen here as an emerging critique of web 2.0 applications. Part of the critique within this discourse concerns the way that the labour of web 2.0 platform users is used by the platform owners for profit, without compensation for, or acknowledgement of, the immaterial labour provided by the users (Scholz 2010; Terranova 2004). The mechanisms of control over user-generated data will be discussed in terms of front-end design and back-end intelligence, the latter explored in terms of algorithms, data mining, data aggregation and data retention. The aim of this article is therefore not to ask whether users care that their activities are being exploited; the aim is to bring forth the ways in which user-generated value is facilitated by means of the technical design, and the lack of control that users have over the way their data is used, as these are often-neglected parts of the research done within the free labour discourse.
Blurring lines between front-end and back-end algorithmic control
On the level of the front-end graphical user interface, user activities are continuously being steered. On Facebook, participation is built in as a default design feature; the business model is inscribed into the design and shapes user activities, as is shown by the recent discoveries made about the news feed algorithm EdgeRank (Kincaid 2010). Here the line between the front-end and the back-end becomes unclear. Regulation of user activities takes place through the algorithm's decisions about which status updates to show in the user's news feed: links, videos and photos are given preferential treatment over plain status updates, which means that they are more likely to show up in the news feed (Weber 2010). This process can be seen as sociodigitization, as coined by Robert Latham and Saskia Sassen, which means that Facebook can be seen as "an example for the explicit (socio-)digitization of social relations that were mediated quite differently in the past […] The 'network of family, friends, neighbors and colleagues' [are] now recreated inside of the system" (Rieder 2007). The news feed algorithm becomes the social filter for the users of Facebook: what happens "when the visibility of an opinion becomes a question of algorithms? Meaning is deeply embedded in the non-discursive - in software itself" (Rieder and Schäfer 2008, 3).
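The EdgeRank mechanism can be made concrete with a short sketch. The actual algorithm is proprietary; the sketch below only models the publicly reported formula - affinity times edge weight times time decay (Kincaid 2010) - and all numeric weights and the decay function are invented for illustration.

```python
from dataclasses import dataclass

# Illustrative edge-type weights: rich media is reported to outrank
# plain status updates (Weber 2010). The exact values are invented.
EDGE_WEIGHTS = {"photo": 3.0, "video": 3.0, "link": 2.0, "status": 1.0}

@dataclass
class Edge:
    kind: str         # "photo", "video", "link" or "status"
    affinity: float   # how closely the viewer interacts with the author (0..1)
    age_hours: float  # time since the item was posted

def edge_rank(edge: Edge) -> float:
    """Score one candidate news-feed item: affinity * weight * time decay."""
    decay = 1.0 / (1.0 + edge.age_hours)  # newer items score higher
    return edge.affinity * EDGE_WEIGHTS[edge.kind] * decay

def rank_feed(edges):
    """Order candidate items so the highest-scoring ones surface first."""
    return sorted(edges, key=edge_rank, reverse=True)

feed = rank_feed([
    Edge("status", affinity=0.9, age_hours=1.0),
    Edge("photo", affinity=0.5, age_hours=1.0),
])
# The photo (0.5 * 3.0 before decay) outranks the status update
# (0.9 * 1.0), even though the viewer's affinity with its author is lower.
```

The point of the sketch is the one made in the text: the weighting bakes an editorial preference for certain content types into code, invisibly to the user.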
Not only is the algorithm a social filter; it is also an important aspect of the back-end of Facebook, in the form of the data mining of the vast amounts of user-generated data stored on the servers. Data mining is the process of exploration and analysis of a dataset, usually one of considerable size, in order to discover patterns, extract relevant knowledge or obtain meaningful recurring rules. When it comes to Facebook and its business model, data mining is used to make sense of the user-generated data stored on the servers and to filter out the users' interests and patterns, so that advertisements can be tailored to those interests. Some interesting developments have been made regarding data mining; two examples will be explored here: first, an experiment currently employed by Facebook, and secondly, a recently implemented new advertisement system.
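As an illustration of the kind of pattern extraction just described, the sketch below mines a handful of stored status updates for recurring terms that could feed interest-based ad targeting. Facebook's actual mining pipeline is undisclosed; the sample data, stopword list and threshold here are all invented.

```python
from collections import Counter
import re

# A toy store of retained status updates (stand-in for the server-side database).
status_updates = [
    "Watching a great movie tonight",
    "This movie soundtrack is amazing",
    "Craving pizza after the movie",
    "Best pizza in town!",
]

STOPWORDS = {"a", "the", "this", "is", "in", "after", "best"}

def mine_interests(updates, min_count=2):
    """Return terms that recur across updates: a crude 'interest' profile."""
    words = Counter()
    for text in updates:
        for token in re.findall(r"[a-z]+", text.lower()):
            if token not in STOPWORDS:
                words[token] += 1
    return {word for word, n in words.items() if n >= min_count}

print(mine_interests(status_updates))  # e.g. {'movie', 'pizza'}
```

Even this naive version shows the asymmetry the article describes: the patterns are extracted server-side, out of the user's sight and control.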
With a focus group of approximately six million people, Facebook is currently experimenting with the data mining of real-time conversations and status updates for advertisement purposes. Delivering targeted advertisements based on status updates is not new, but it has until now never been done on a real-time basis. The algorithm behind this is continuously being adjusted, and the advertisements target audiences based on data collected over longer periods of time; this closely relates to data retention, discussed below (Slutsky 2011). This goes to show that "[s]oftware is responsible for extending, both quantitatively and qualitatively, the role that technology plays in the everyday practices that make up modern life" (Rieder and Schäfer 2008, 3). This service allows for the display of advertisements that are even more customized and specified to the users' interests. The algorithm filters, structures, interprets, and visualizes information in an automatic fashion (Rieder and Schäfer 2008, 2). The regulation of users' activities and their data here takes the form of tailored advertisements, which are optimized for a maximum click-through rate.
Another recent example of the further development of the advertisement algorithm based upon users' posts is 'sponsored stories', in which the algorithm pulls content out of a status update and uses it to place advertisements on the pages of the user's friends. This feature is designed to build brand buzz and works on the basis of likes and check-ins by users when they visit restaurants, websites, events, products and so forth. The data generated by users is hereby directly turned into advertisements and displayed on the pages of their friends (Segall 2011). An interesting development here is that it is not the advertiser that controls the content; it is the users' actions, and therefore also their labour, that do so. Now, even posting a status update improves the information and advertisement infrastructure in real time. It should be noted, however, that these actions can also be turned against the system: by strategically posting a status update about craving a pizza while not actually being hungry, the user can, for instance, obtain a discount coupon. But this only plays into the hands of the advertisers and companies, as it increases the click-through and conversion rates of the advertisements. This example shows the invisibility of the way the business model is implemented in the front-end technical design, the graphical user interface. The advertisements don't even look like advertisements anymore; they are now basically just text links within the status update itself.
Back-end: data retention
On the level of the back-end, data retention plays an important role. Facebook has everything to gain by keeping users locked in in a technical manner and by keeping control over the user-generated data. The more data is stored, the more specific the advertisements that can be shown to users. Showing specific and tailored advertisements that fit right into the users' interests and needs is an important part of the business model; it is about the discovery of "combinations of past behaviour, location, demographics, and temperament, that make individuals more likely to be influenced by a finely-pitched marketing appeal" (Andrejevic 2009).
In a recent blog post, Mark Zuckerberg provided some insight into the data retention span. The post announced that a new feature had been implemented into Facebook: the ability to download everything you have ever posted, including pictures, wall posts and so forth (Zuckerberg 2010). This is furthermore reflected in the privacy policy of Facebook: "[w]e save your profile information (connections, photos, etc.) in case you later decide to reactivate your account." Additionally, the latest version of the terms of service states: "[h]owever, you understand that removed content may persist in backup copies for a reasonable period of time." What happens with the data in these backup copies is not clear. What is problematic about this is that the user-generated data is entrusted to a private company, with no certainty about what the data is used for or who has access to the vast amount of data available on the servers. This is something that Mejias found problematic as well, as outlined in his article, in which he argued that the 'social' is becoming part of corporately owned platforms, and that a discussion of the commodification of social life and the privatization of public space is missing. This private ownership shapes the dynamics on the platform and enables "both the creation of new social spaces and the controlling and monitoring of these spaces through mechanisms facilitated by the architecture of the network itself" (Mejias 2009, 606). How much control the user really has over what happens in the back-end, over the private ownership of their data, or over the certainty that everything is indeed deleted from the servers of Facebook, remains undervalued in the public discourse.
Back-end: data aggregation and data flow
Closely related to data retention is the way in which value is created through the aggregation of a large number of people and their data in a single place (Scholz 2010). Aggregation here means the collection of data from the users, whether it is movies they have liked when visiting other sites or a recipe the user has liked. It is about compiling a complete profile of the users: their likes, dislikes, interests, political preferences, and so forth. This does not necessarily have to be confined to the Facebook site; data can also be aggregated from other sites: "[t]he very density and intensity of our network interaction can be transformed into profitable spreadsheets. […] This all leads up to an expressive data portrait of each of the hundreds of millions of users of Facebook" (Scholz 2010). This is demonstrated by the Open Graph Application Programming Interface (Open Graph API), which was introduced in 2010. With this API, any website can have the same functionalities as a Facebook page, which means that users can like the website, or items on the website. To give an example: on the website imdb.com, if the user goes to the page for a movie, a Facebook like button can be seen in the right sidebar. If the user clicks this button, the movie will show up on the user's Facebook profile page under 'movies', and a wall post is made stating that the user likes this movie. In this way, user-generated data is transferred from one site to another. This is just one of the many aspects of the Open Graph API; it also enables the customization of third-party websites based on the interests and Facebook profile information of the users. By providing an easy-to-use API for website owners, Facebook makes it even easier to aggregate as much data as possible about the users.
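The data flow just described, in which a like on a third-party site lands in a central, platform-owned profile, can be sketched as follows. This is not the real Open Graph API, whose server side is not public; the class and method names below are hypothetical stand-ins for the aggregation logic.

```python
from collections import defaultdict

class SocialGraph:
    """Toy stand-in for the platform-side store behind embedded like buttons."""

    def __init__(self):
        self.likes = defaultdict(list)  # user -> [(site, item), ...]

    def like(self, user, site, item):
        """Called whenever a user clicks a like button embedded on any site."""
        self.likes[user].append((site, item))

    def profile(self, user):
        """Aggregate cross-site likes into a single data portrait."""
        return self.likes[user]

graph = SocialGraph()
# Likes arrive from different websites, but all land in one central profile.
graph.like("alice", "imdb.com", "The Matrix")
graph.like("alice", "recipes.example", "Lasagne")

print(graph.profile("alice"))
# [('imdb.com', 'The Matrix'), ('recipes.example', 'Lasagne')]
```

The design choice worth noticing is that the third-party site only embeds the button; the resulting data is stored and owned centrally, which is exactly the asymmetry the article criticizes.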
This is something that Andrejevic calls the 'virtual digital enclosure': every virtual move has the potential to leave behind a trace or record of itself. "When we surf the Internet, browsers can gather information about the paths we take – the sites we've visited and the click streams that take us from one site to the next" (Andrejevic 2009). When products or items are purchased online, detailed accounts of the transactions are left behind. "Entry into the digital enclosure brings with it the condition of surveillance or monitoring" (Andrejevic 2009). This is also the case with the Open Graph API: if a user is not logged out of Facebook and goes to another site that has the Open Graph API installed, their movements are, or could be, tracked. It is the "creation of an interactive realm wherein every action, interaction, and transaction generates information about itself" (Andrejevic 2009). This leads to the next problematic issue of control: the flow of user-generated data and the users' lack of control over this flow. As Bernhard Rieder described in his article 'Networked Control: Search Engines and the Symmetry of Confidence', search engines can no longer be seen as black boxes, because this "implies that we still have a clear picture of the outside shape of the object; there still is an object and we know where it starts and where it ends, we can clearly identify input and output." Rather, they can be characterized as a black foam. It is argued here that the same can be said of Facebook, in the sense that in the back-end of the platform, the users' lack of control shows itself in the fact that it is not clear where the application ends, where the user-generated data flows, and who is able to exploit this data.
The element of control on the side of Facebook further shows itself in the problematic aspect that users often cannot opt out. Developments are forced upon the users without giving them an option to turn the new forms of advertisement off, or to control the way in which their data is used. The politics of Facebook are hereby implemented into the technical design of the privacy settings, where there is no option to opt out of the advertisement models of new developments. The same has been said about earlier developments on Facebook, for instance the heavily critiqued launch of the Beacon program in 2007, in which products bought by users on partner sites (of Facebook) were automatically shared with their friends on Facebook, a feature implemented without notifying the users and without the ability to opt out (Singel 2008). After much critique this program was discontinued, but it does go to show that Facebook can implement a feature at will and has control over the user-generated data and data flow.
Going beyond the privacy issues
On Facebook, the free services that are consumed by the users are only free on the surface. The labour that is provided, which also includes the installation of the Open Graph API by webmasters, only contributes to improving the infrastructure and advertisement system of Facebook. Furthermore, the data entrusted to the platform is completely in the hands of the platform owners. The issue here is not so much one of personal privacy, as notions of privacy often vary according to cultural context; it is the power gained by the platform owners and the lack of control by the users that this constitutes (Andrejevic 2002, 232). Furthermore, it is about the platform owners' control of the information flow, and the lack of control that the users have over this flow.
As outlined in this article, technology plays an increasingly important role in the regulation of user-generated data. The policies and public discourse are implemented in the algorithms, the APIs, data retention, and the way in which user-generated data is aggregated. And as was shown in the 'sponsored stories' example, advertisements are shown in the technical design of the status update in such a way that they don't even look like actual advertisements anymore, which further contributes to the invisibility of the underlying mechanisms of Facebook. But as Facebook frames it in the popular discourse, the advertisement "gives you the chance to connect to the companies and brands you like and learn more about their products and services" (Sandberg 2010). This framing disguises what is actually going on in the back-end and in the underlying mechanisms.
Having discussed the problematic issue of the invisibility of the technology, it is important to note that technology does not stand on its own. The regulation, control and exploitation of user-generated data happen in conjunction with the privacy settings, the licenses, the policies, the users' activities and also the public discourse. This was already implied when discussing the technological aspects. Furthermore, as Petersen noted, "[i]t is when the technological infrastructure and design of these sites is combined with capitalism that the architecture begins to oscillate between exploitation and participation." In this respect, Facebook can also be seen as both a playground and a factory, something that was explored by Trebor Scholz, although his focus lies more on digital labour and its problematic aspects. Facebook as a platform can be seen as a playground in the sense that the users like to be on the platform and socialize with other users. However, this notion of a playground can also be viewed in the following manner: the technical design (back-end and front-end), the licenses and policies, the platform owners controlling the user-generated data, and the users themselves are all dynamically intertwined and are actors on the playground that is Facebook. All this while the users occasionally leave the playground to go to another website, their movements, and therefore also their labour, being tracked by the underlying mechanisms of Facebook. The playground in this respect is the field on which the various actors dynamically interact with each other.
Facebook can be seen as a factory in the sense that the actions of the users generate value for the platform and for other companies as well. The labour actions have shifted to places where they don't look like labour anymore (Scholz 2010, 242). This article focused on the facilitation of this labour by means of the technical design and brought forward the ways in which the underlying mechanisms of Facebook regulate user-generated data. It is argued here that this happens in ways that have often become invisible to the users. What is problematic about the invisibility of the underlying mechanisms is the lack of control that it constitutes: the uncertainty about what exactly is done with this data and what exactly is collected, as well as the way the technology steers the users' activities so that the data generated by those activities can be used for profit by the platform owners. The continuous development of the technology transforms the ability to aggregate user-generated data. The technology has the ability to capture, aggregate and redistribute the data of the users, while the owner of this data is often unaware of its storage and utilization and of the detail of the profile that is created of the user (Spärck Jones 2003, 4-5). It is therefore important to continue doing research, and the playground will certainly be an interesting field for further research into the various dynamic interactions between the actors at play.
Notes

[1] User-generated data is here understood as filling in profile information, posting a status update, or even accepting a friend request (i.e. creating friend connections): any action that improves the database infrastructure of Facebook. It is not to be confused with user-generated content, which encompasses more than just data, as it also entails the uploading of photos and videos.

[2] In a presentation on October 8, 2009, Jeff Rothschild stated that Facebook had 30,000 servers to support its operations. Since this presentation Facebook has expanded its user base even further, but current figures remain unknown.

[3] An Application Programming Interface is "a set of calling conventions defining how a service is invoked through a software package" (RFC 1208). It determines the way in which an application program communicates with an operating system or database; it can therefore be seen as an intermediary between the application (from, for instance, a third-party company) and the database and infrastructure of Facebook.
References

Andrejevic, Mark. 2002. The work of being watched: interactive media and the exploitation of self-disclosure. Critical Studies in Media Communication 19 (2): 230-248.

Andrejevic, Mark. 2009. Privacy, exploitation, and the digital enclosure. Amsterdam Law Forum 1 (4). http://ojs.ubvu.vu.nl/alf/article/viewArticle/94/168

Clarke, Roger. 2009. Fundamentals of Information Systems. Roger Clarke's website. http://www.rogerclarke.com/SOS/ISFundas.html

Facebook. 2010. Terms of service. Facebook website, October 4. http://www.facebook.com/terms.php

Facebook. 2010b. Privacy policy. Facebook website, October 22. http://www.facebook.com/policy.php

Gandy, Oscar. 1993. The panoptic sort: a political economy of personal information. Boulder, CO: Westview.

Kincaid, Jason. 2010. EdgeRank: the secret sauce that makes Facebook's news feed tick. TechCrunch, April 22. http://techcrunch.com/2010/04/22/facebook-edgerank/

Madrigal, Alexis. 2010. How the Facebook news feed algorithm shapes your friendships. The Atlantic, October 22. http://www.theatlantic.com/technology/archive/2010/10/how-the-facebook-news-feed-algorithm-shapes-your-friendships/64996/

Petersen, Søren Mørk. 2008. Loser generated content: from participation to exploitation. First Monday 13 (3). http://firstmonday.org/htbin/cgiwrap/bin/ojs/index.php/fm/article/view/2141/1948

Rieder, Bernhard. 2005. Networked control: search engines and the symmetry of confidence. International Review of Information Ethics 3 (June).

Rieder, Bernhard. 2007. Sociodigitization and Facebook. The Politics of Systems, October 15. http://thepoliticsofsystems.net/2007/10/15/sociodigitization-and-facebook/

Rothschild, Jeff. 2009. CNS lecture series. University of California, October 8. http://cns.ucsd.edu/lecturearchive09.shtml#Roth

Scholz, Trebor. 2010. Facebook as playground and factory. In Facebook and philosophy: what is on your mind?, ed. D.E. Wittkower, 241-252. Chicago, IL: Open Court.

Segall, Laurie. 2011. Facebook's 'sponsored stories' turns your posts into ads. CNN, January 26. http://money.cnn.com/2011/01/26/technology/facebook_sponsored_stories/index.htm

Singel, Ryan. 2008. Facebook Beacon tracking program draws privacy lawsuit. Wired, August 14. http://www.wired.com/threatlevel/2008/08/facebook-beacon

Slutsky, Irina. 2011. Facebook test mines real-time conversations for ad targeting. Ad Age Digital, March 23. http://adage.com/article/digital/facebook-test-mines-real-time-conversations-ad-targeting/149531

Spärck Jones, Karen. 2003. Privacy: what's different now? Interdisciplinary Science Reviews 28 (4): 287-292.

Terranova, Tiziana. 2004. Network culture: politics for the information age. London: Pluto Press.

Vogelstein, Fred. 2007. The Facebook revolution. LA Times, October 24. http://www.latimes.com/news/opinion/la-op-vogelstein7oct07,0,6385994.story?coll=la-opinion-center

Weber, Thomas E. 2010. Cracking the Facebook code. The Daily Beast, October 18. http://www.thedailybeast.com/blogs-and-stories/2010-10-18/the-facebook-news-feed-how-it-works-the-10-biggest-secrets/full/

Zimmer, Michael. 2008. Facebook's Zuckerberg on increasing the streams of personal information online. Michael Zimmer's blog, November 8. http://michaelzimmer.org/2008/11/08/facebooks-zuckerberg-on-increasing-the-streams-of-personal-information-online/

Zuckerberg, Mark. 2010. Giving you more control. The Facebook Blog, October 6. http://www.facebook.com/blog.php?post=434691727130
This network thing called 'the Web'
Bob van de Velde

Abstract

This article examines how Discourse Analysis can be used to understand the Web. The Web is conceptualized as a dynamic oligarchy of a variety of embedded networks which contain most users. Apart from having distinct embedded networks, the Web has also been conceptualized as a political field by various scholars. To explain the use of Discourse Analysis, the method is first introduced. After this introduction, parallels are drawn which illustrate the way in which Discourse Analysis reflects the Web's infrastructure. Hereafter, Discourse Analysis helps to combine the ontological claims about the Web with the political claims about the Web, thus providing a theory of how the Web works as a political infrastructure composed of interacting layers.

Keywords: heterarchical networks, scale-free networks, web-politics, Discourse Analysis
Liberating, oppressing, bringing people together, keeping people apart: the Web is argued to change our lives in profound and conflicting ways. How does the Web change people? This article discusses the Web in order to understand it as a channel of communication and change. This entails discussing the Web as a piece of infrastructure, a network of communicating nodes and a political space. But that is an analysis of what the Web is, not how the Web works. To elaborate on the working of the Web, this article argues for the use of discourse analysis. To show this, the article draws parallels between discourse analysis' conceptualization of social networks and the Web as an ontological thing, which argues for its fit as an epistemological tool for analysis. While going deeper down the rabbit hole, we can see the interplay between the actor, the Web and the discourse within and between the two. This article thus builds a discourse to describe the power-shaping discoursal practice of the Web.1
This Network thing called ‘the Web’
A network can be many things. The basic definition given by Van Dijk (2006, 24) makes networks out to be "a collection of links between elements of a unit" (emphasis original). Social networks in particular, i.e. networks between people, play a pivotal role in societies (Van Dijk 2006, 26). One of the most modern of these networks is the Web.

Galloway explains that the Web is a layer of conversation contingent upon the working of more technical, supportive layers (Galloway 2004). These supportive layers, called protocols, determine how information should be communicated, where it should go, how it should get there and whether it has successfully been communicated. This is done without reference to a specific entity or node in the network. The absence of functional differences between nodes makes this network a distributed network instead of a hierarchical or decentralized one. Additionally, the content of the information communicated in this network is not considered by these protocols. As such, the supportive protocols which frame the Web make it both distributed in regard to the communication process and neutral in respect to content (Galloway 2004, 46). In relation to the Web, these protocols underpin the idea of the Web as a zero-institution described by Dean (Dean 2003, 106). This concept entails a social institution without a normative claim, but which may contain normative claims.
1 Interestingly enough, Phillips and Jørgensen reject the structuralist metaphor of a fishing net, with set relations, and instead prefer the use of the internet as a model (2002, 11).
If the Web is considered to be a zero-institution, there is no a priori normative difference between content providers; such a difference would contradict the very concept of a zero-institution. Still, Shirky finds a de facto difference in traffic and attention between certain content providers (Shirky 2006). This difference doesn't result from technical differences, such as data usage or availability. Instead, the difference is normative, because it is grounded in the preferences of users (Shirky 2006, 38). The difference in traffic, attention and income between content providers is so strong that roughly 20% of the providers generate 80% of the traffic, attention and income. Such a distribution isn't the result of a non-egalitarian system, but of large and unconstrained choice (Shirky 2006, 42). This means that although each node can in theory contribute content equally, some nodes are in practice dominant as content providers.
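Shirky's observation, that strong inequality can emerge from free and unconstrained choice rather than from a rigged system, is easy to reproduce in a small simulation. In the Barabási-style sketch below, each new node links to an existing node with probability proportional to the links that node already has; the node count and random seed are arbitrary, and the sketch makes no claim about the exact 80/20 split, only about heavy skew.

```python
import random

def preferential_attachment(n_nodes, seed=42):
    """Grow a network in which new nodes link to popular nodes more often."""
    random.seed(seed)
    targets = [0, 1]   # pool of link endpoints; a node appears once per link
    degrees = [1, 1]   # start with two connected nodes
    for new in range(2, n_nodes):
        # Picking uniformly from the endpoint pool makes the choice
        # proportional to existing degree: the rich get richer.
        chosen = random.choice(targets)
        degrees.append(1)
        degrees[chosen] += 1
        targets.extend([chosen, new])
    return degrees

degrees = preferential_attachment(1000)
degrees.sort(reverse=True)
top20_share = sum(degrees[: len(degrees) // 5]) / sum(degrees)
print(f"Top 20% of nodes hold {top20_share:.0%} of all links")
```

No rule in the simulation privileges any node in advance; the dominance of a few nodes is purely an emergent effect of accumulated individual choices, which is the point made in the paragraph above.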
This non-equal distribution forms an emergent property of the Web. The landscape which results is quite different from a general rhizome structure. To illustrate this, Barabási summarizes the results of empirical research concerning the Web:

"Researchers studying these huge samples have made some fascinating discoveries. They have found that the Web is fragmented into continents and communities, limiting and determining our behavior in the online universe." (2002, 162)

The fragments of the Web can be seen as embedded networks. Examples are social network sites (SNS) such as Myspace and Facebook, or forums and user groups. These distinct provinces are dependent on the broader frame of the Web, but form platforms of communication distinct from the broader Web. As such, they are networks within the Web, which draw users from the broader Web into their boundaries. When considering the unequal distribution of connections, such websites form nodes with a superior amount of traffic, attention and income. Within the context of the Web, this situation can be seen as an oligarchy, in which a few providers connect (to) the majority of the users of the Web by drawing them into their bounded territories, thus forming what I call embedded networks (ENs).
Despite being central in terms of connections and closed in terms of internal rules and practices, these embedded networks are not fully stable. Torkjazi (2009, 6) writes: “existing [social network sites] appear to be very vulnerable to the arrival of new “fashions” among users ...”. Similarly, the underlying layer of the Web, the internet, can be used as a censorship tool. Such intervention ‘from below’ can be seen in, for instance, China (Diamond 2010, 73). Combined, this means that ENs can be influenced both by the users within them and by the internet underlying them. As such, the Web and the ENs therein are part of a heterarchy (Kontopoulos 1993, 55). This is a type of network in which layers are intertwined and can influence each other, thus creating influence from below, above and within a specific layer. More importantly, this argues for a dynamic oligarchy within the Web, rather than a Web dominated by a static set of ENs.
Web-Politics
So far, differences in the Web have been discussed in terms of connections, but not politics. Returning to Dean's idea of the Web as a zero-institution, the Web is open to the introduction of values. For Dean, the Web as such equates to a country which doesn't have a specific set of dominant values, but in which values are determined by the people within it (108). The Web in this respect becomes a space in which different values and beliefs meet. Friedland (2004) claims the Web as such is an important source of socialization:
“The model of the well-socialised individual capable of communicatively rational action is, in fact, poised between primary socialisation in the family and secondary socialisation in the world of institutions. The transformation of secondary institutions – the schools, community associations, indeed the family – into networked environments has created a secondary lifeworld in which the media itself becomes a major source of socialisation. “Life online” is more than a metaphor for those under 35 (and many over). It is a new form of life that influences core forms of intersubjective communication and sociation” (2004, 23)
Considering the Web as a source of socialization, many actors on the Web use it as a channel through which they can communicate their values. Such political use of the Web can be seen in the work of Marres. Marres describes issue networks and their use of various ICTs, such as the Web, in what she calls ‘info-politics’: using ICTs like the Web to format issues in specific ways, thus attempting to portray a specific position as favorable (2006, 15).
Just as the Web is not stable with regard to its most connected parties, neither is it stable with regard to which values dominate. Such a system is not democratic, i.e. working toward a shared opinion for the majority of the people. Instead, Dean argues such a system is a neo-democracy, in which conflicting positions struggle for (temporary) dominance (2003, 108). This struggle doesn't just focus on people's opinions, but on the fundamental beliefs, identities and practices of individuals. This makes the Web a battleground of ideas, a strongly political space.
Introducing Discourse Analysis
To recap what has been said: the Web is not a whole of equally connected nodes, but a dynamic oligarchy in which different ENs dominate communication with and between sub-sets of Web-users. Additionally, this dynamic oligarchy called the Web is a political space, where struggles over beliefs, identities and practices occur. But it is one thing to say that the Web is an unequal space of political struggle; it is something else entirely to say how this works. Enter Laclau and Mouffe's discourse analysis, an epistemological tool which provides a framework explaining how communication works as politics. First its parallels with the Web as a thing are drawn out. This helps understand how the nature of the Web as a dynamic oligarchy reflects the conceptualization of Discourse Analysis. Secondly, a synthesis is made which enables an analysis of the Web as it works, rather than as what it is.
Parallels
A discourse can be many things. Phillips and Jorgensen (2002, 11) argue that “signs (…) acquire their meaning by being different from other signs, but those signs from which they differ can change according to the context in which they are used”. Ultimately, Laclau and Mouffe's discourse theory supposes all societal formations are products of discursive processes (2002, 34). Discourse as such is a web which structures meaning.
Laclau and Mouffe claim that the meaning of signs, which can be both words and behaviors, is created by putting these signs in relation to other signs. The process of making these connections is articulation, an act of forming the relations between signs. In these relations, a sign is transformed from a meaningless element into a meaningful moment: a sign laden with meaning based on the constructed relations with other signs. In these relations, there are central moments around which other signs are ordered; these are called nodal points. Just as an embedded network exists around a specific provider, a discourse exists around a nodal point. An example of a nodal point is ‘ANT’, which attributes a certain meaning to the element ‘actor’. In this case actor means ‘a thing which acts on another thing’, ‘instituting change by itself’ (Latour 2005, 150-153). A competing nodal point would be ‘theater’, which attributes a different meaning to the element ‘actor’ by tying it to elements such as ‘person’, ‘part’ and ‘script’, thus making it a different moment. As the example shows, there are multiple nodal points competing over the meaning of specific elements. The meanings of an element not included in a discourse form the field of discursivity. These are all possible other meanings attributed to a certain element. Bringing these other meanings into a discourse, for instance by different ways of framing, creates conflict. This conflict over meaning is called politics, whereas meanings beyond dispute are called objective. Discourse as such creates a space in which values are made from otherwise neutral elements, which forms a process of socialization akin to that discussed by Friedland (2004).
If discourse is considered to be an ordering tool which combines otherwise neutral elements, there seems to be no a priori difference between these elements. Still, in practice some elements, or discourses, are dominant. In these discourses, elements have attained closure: their meanings as a result of discourse are no longer challenged. These elements have become nodal points, thus ordering a discourse. As the example shows, multiple discourses can compete over the meaning of specific elements, like ‘actor’. But it is nodal points like ‘ANT’ or ‘theater’ which struggle over the element ‘actor’. This makes the nodal points around which these moments are ordered more influential than other elements which have not attained the status of nodal points. This means that although each element can in theory equally contribute to meaning, some elements are in practice dominant as nodal points, just as all the nodes on the Web could equally contribute, but in practice don't.
This non-equal status of elements forms an emergent property of discourses. The resulting landscape is quite different from a general rhizome structure. Nodal points have conflicting articulations of specific elements. These articulations happen within a broader frame of signs which have not yet been articulated by these nodal points: the field of discursivity. This field is the other of a specific discourse. It contains both elements and competing discourses (Phillips & Jorgensen 2002, 27). Within the context of the field of discursivity, this situation can be seen as an oligarchy, in which a few nodal points connect (to) the majority of signs; thus forming discourses.
Despite being central in terms of connections and closed in terms of internal meanings and values, these discourses are not fully stable. Competing discourses in the field of discursivity can challenge the floating signifiers which constitute another discourse, thus attempting to change it. This makes discourses “incomplete structures in the same undecidable terrain never quite become completely structured” (Phillips & Jorgensen 2002, 29). As such, the discourses within the broader frame of discursivity are never fully stable. More importantly, this argues for a dynamic oligarchy within the field of discursivity, rather than a field dominated by a static set of discourses.
This parallel shows that discourses are in the field of discursivity what ENs are within the Web. For theoretical purposes, this helps to understand how a shift in one is similar to a shift in the other. It also helps to show that both networks depend on hegemony, whether of the most connected nodes or of the most influential signs. In both systems there is a struggle, but the next part will argue that the struggle in discourse can be used to understand the struggle within the Web.
Discourse-Politics
Having established the nature of discourses as a parallel to the nature of the Web, and the struggle over meaning, the question is how this struggle is in fact political. As has been discussed, there are multiple discourses within the field of discursivity which struggle over meaning. But the signs in discourse denote not only words and behavior, but also identity (41). As such, the subject of a discourse is fragmented; he or she can have different roles in different discourses. When a discourse is used between one actor and another, there exists interpellation, which means an individual is placed in a certain position by a discourse (Phillips & Jorgensen 2002, 40). A Twitter user is interpellated as someone who needs to communicate small texts. A user thus exists as a subject of different discourses, which carry their own norms of behavior. This makes the user over-determined: he or she can be positioned in several different discourses. Some of these discourses can have conflicting positions, as it is hard to be a privacy advocate and a Facebook user at the same time.
Discourses socialize because of the relations they put between positions and behavior. These relations can be both linking and oppositional; they determine what goes together and what excludes each other. But these discourses also play a central role in group-formation (Phillips & Jorgensen 2002, 43-44). This is a process of identification between a subject and a group signifier, for instance Twitter user. With this group signifier, which becomes a master signifier to the subject, there come meanings related to it by discourse. This includes the practice of frequent short texts, transparent updates, etcetera. But a group signifier can also be linked to, for instance, Greenpeace sympathizer, left-wing voter or anti-communist. The user in an EN is interpellated to certain behavior as part of that EN. Additionally, ENs provide channels through which discourses can struggle over the identity of users.
Because of the heterarchical nature of the dynamic oligarchy of the Web, discourse politics function in at least three distinct ways: the interactions between users facilitated by ENs, ENs' influence on user discourse, and the user-discourse influence on ENs. Discourse Analysis (DA) provides a method of analysis to see the working of these influences.
ENs' influence on inter-user discourse
Using the vocabulary of DA, discourses are networks forming political struggles over the meaning of words and behavior. These struggles happen in the acts of actors, in the process of articulation. More practically speaking, the inter-human network can be considered a stage in which the articulations of users introduce their discourse into the field of discursivity of other users. This communication of discourse influences the discourse-network within a user. In this way, DA provides a vocabulary to describe the way in which the words and behaviors of one user influence the meanings of another user, by talking about the interplay between the inter-user network and the underlying discourse-networks. In short, the Discourse-User-Network Interplay (DUNI) describes how inter-actor communication through ENs affects individual user meanings and behavior.
Using DUNI, it is possible to describe in detail how the position of central nodes serves as a position of power. Because of their many connections, their discourses can spread more easily throughout the network, thus changing the behavior of many other actors. This spreading of discourse can make objective moments political, for instance by introducing a competing discourse into the relation between a person and a brand or a government. As such, central nodes act upon the possible actions of other actors. DA thus shows how socialization and issue formation are generated through the ENs on the Web.
ENs' influence on user discourse
Apart from facilitating inter-user discourse, ENs often engage in direct communication with their users. One way of doing this is through adverts, for example smart advertisements. These use the words of users, or sometimes even pictures of their friends, to generate targeted ads. To understand how this works to change users, DA provides a valuable framework. Such communication uses moments from the discourse of the user to entice action: moments the user expresses, for instance by searching, are tied to other actors, who pay for this service, by placing a literal link. As such, paying actors are positioned as additions to the user's discourse. Instead of politicizing the user's discourse, it is added to, thus expanding the discourse with elements which were previously devoid of meaning. Thus, the user is introduced to other actors by fitting those actors into the user's discourse, a way of formatting this discourse.
ENs' effect on user behavior
A more stringent matter is the effect of ENs on user behavior. Longford (2005) argues that websites, like those which form ENs, change users' behavior in order to harvest personal information. Once again, DA can provide a method of analysis for this behavioral change, or lack thereof, by looking at the way in which ENs contribute to individual discourse. Consider the user who is at first unwilling to yield to such a system. This user is confronted with a struggle between a discourse in which the action of joining, as a sign, is connected to diminished privacy, and a discourse in which privacy is positively related to personal identity. To accept the action of joining, the latter discourse must not be dominant. If the user chooses not to join after being confronted with the dismissal of privacy in exchange for system use, this means the discourse which positively relates to privacy has attained hegemony at the expense of the discourse relating to access, and vice versa.
Yet there also exists the possibility that the user doesn't value privacy, not even negatively. In this case, a DA analysis would point out that privacy is an element, not a moment, for this user. This leads the user to carry out the action of joining the network without struggle. Only when a conflicting discourse from the field of discursivity introduces a different order in which privacy is valued will there be a change in the practice of usage. This highlights the nature of change in behavior instituted through ENs. Their use entails the inclusion of practices in individual discourses. As such, ENs engage in the conduct of conduct, creating changes in the behavior of their users. DA helps to show how ENs structure user conduct through restructuring discourse, and how introductions of new discourse may challenge such relations.
User-discourse effects on ENs
ENs like social network sites are contingent upon their use. As explained in the description of the Web as an infrastructure, ENs are part of the broader Web. As Torkjazi (2009) argues, social network sites are an example of ENs which seem to have a life-cycle after which they are abandoned by their users and ultimately vanish. As was argued, this is what makes the oligarchy of the Web dynamic. DA can be used to explain how users' use of ENs can change. DA sees behavior as part of discourse. As such, practices of using particular ENs are part of a broader framework of meanings generated by the relational ordering of signs. Theoretically, this means that the use of, for instance, MySpace vis-à-vis Facebook is part of the identity of users. Hargittai (2008) finds that the choice of social network site is correlated with ethnicity. This indicates an influence of identity on the practice of social network site usage. Should the discourse of users change, through a change in identity or in the relation between identity and the social network site, this could entail a shift in the practice of usage. When a different service enters the discourse of users by challenging the role of the previous social network site, this creates a politicization of said moment. As such, discourse is the way in which users stick to or abandon certain ENs. Rather than analyzing such shifts as the result of ‘fads’, DA provides a way to trace the restructuring of meaning as a cause for such changes.
Conclusion
This article introduced Discourse Analysis as a complementary theory for an analysis of the Web. Did it succeed? The conceptualization of discourses is similar to that of the Web. The groundwork is neutral and equal, but in practice dominated by structures which are mutually differentiated. In both systems, these dominant structures are subject to change facilitated by their neutral groundwork. The added value of DA then lies in the way it describes the role of discourses as structures which include individual behavior. This helps to make explicit the ways in which power acts throughout networks. From that analysis result paths of both user and EN control. Three levels of these processes were given as theoretical analyses of the political phenomena on the Web. These examples argue for the applicability of DA when analyzing the Web. Accepting this thesis implies putting it to work: simply put, this article argues that DA should be empirically applied to gain insight into if and how discourses shape behavior. Perhaps more importantly, into how this shaping generates and maintains ENs and how users are affected by these ENs. In short: what are the politics which create, maintain and empower the ENs?
References
Barabási, Albert-László. 2002. Linked: The New Science of Networks, 143-178. Cambridge, Massachusetts: Perseus Publishing.
Dean, Jodi. 2003. Why the Net Is Not a Public Sphere. Constellations 10, no. 1: 95-112.
Diamond, Larry. 2010. Liberation Technology. Journal of Democracy 21, no. 3: 69-83.
Dijk, Jan van. 2006. Networks: The Nervous System of Society. In The Network Society, 19-41. London: Sage Publications.
Friedland, Lewis A., Hernando Rojas, and Thomas Hove. 2004. The Networked Public Sphere. Javnost - The Public 13, no. 4: 5-26.
Galloway, Alexander R. 2004. Protocol: How Control Exists After Decentralization, 1-53. Cambridge, MA: MIT.
Hargittai, Eszter. 2008. Whose Space? Differences Among Users and Non-Users of Social Network Sites. Journal of Computer-Mediated Communication 13: 276-297.
Jorgensen, Marianne and Louise Phillips. 2002. Laclau and Mouffe's Discourse Theory. In Discourse Analysis as Theory and Method, 24-59. London: Sage Publications.
Kontopoulos, Kyriakos M. 1993. The Logics of Social Structure. Cambridge: Cambridge University Press.
Latour, Bruno. 2005. Reassembling the Social: An Introduction to Actor-Network-Theory, 121-156. Oxford: Oxford University Press.
Longford, Graham. 2005. Pedagogies of Digital Citizenship and the Politics of Code. Techné: 68-96.
Marres, Noortje. 2006. Net-Work Is Format Work: Issue Networks and the Sites of Civil Society Politics. In Reformatting Politics: Information Technology and Global Civil Society, ed. Jon Anderson, Jodi Dean, and Geert Lovink, 3-17. New York: Routledge.
Shirky, Clay. 2006. Power Laws, Weblogs, and Inequality. In Reformatting Politics: Information Technology and Global Civil Society, ed. Jon Anderson, Jodi Dean, and Geert Lovink, 35-42. New York: Routledge.
Torkjazi, Mojtaba, Reza Rejaie and Walter Willinger. 2009. Hot Today, Gone Tomorrow: On the Migration of MySpace Users. WOSN'09, August 17, 2009, Barcelona, Spain.
Abstract

Habermas's late theory of the public sphere is fundamentally about democracy and growing complexity. As new network forms arise, how can Habermas's fundamental theory of the public sphere still be adapted? The new network forms recently under discussion are Web 2.0 platforms. Within Web 2.0 platforms, users demand a cultural freedom. This call for freedom stems from the emergence of a combination of socio-technological and political processes within Web 2.0 that situates the users as an issue network and turns users into something that is more like citizens. As the public sphere is not inherently public, this paper elaborates on the disabling and enabling factors of the public sphere and how users turn into citizens within Web 2.0.

Keywords: public sphere, Web 2.0, network, participatory culture, socio-political, socio-technology.
Introduction
This sweetheart here, this little baby, looks like any ordinary machine, isn’t
that so? A mess of screws and buttons, a whole heap of plastic. Comes with
new words too: RAMS and ROMS. Think that’s what the machine is made of,
do you—the hardware and the software and the mouse? Not a chance. The
computer is made of you, lady. It’s got you all inside it.
(Time Magazine 1983)
In 1983, Time Magazine's cover presented the personal computer as the ‘Machine of the Year’, with the subtitle ‘The Computer Moves In’, which formed the introduction for the information society: the entering of a new phase in which the creation, distribution, and manipulation of information becomes a significant cultural, political, and economic commodity. This commodification was argued by Jean-François Lyotard, who stated that “knowledge has become the principle force of production over the last few decades” (Lyotard 1984, 5). The diffusion of information technologies into society puts forward the visibility and accessibility of knowledge to the public. In other words: “technology enters the lives and homes of common users as the microcomputer” (Schäfer 2011, 9). The notion of technology as part of everyday life was symbolized in 2006 by Time Magazine's different version of ‘Person of the Year’. On the cover, the common user became ‘the person of the year’, declaring that the user is in control of the information age. Schäfer refers to the “emergence of a new global cultural practice” (Schäfer 2011, 9) that derives from the evolving forms of computer use, and calls this ‘participatory culture’1 (Schäfer 2011).
Major critiques of the concept of the information society argue that it ‘romanticizes’ society by presenting it as something completely new. One of these critics is Frank Webster, who argues that the information society is a continuation of contemporary society and, therefore, should not be seen as a radical transformation of society (Webster 2002).
1 The term participatory culture was first coined by Jenkins (1991; 2006a; 2006b; Jenkins et al. 2006). Schäfer describes the term as the advocating of social progress through technological advancements, whereby power relations are reconfigured and related technologies are used for design and user appropriation, all of which gives rise to socio-political dynamics (Schäfer 2011).
Citizens of Web 2.0: Public sphere as cultural public
Lindsy Szilvási
To return to Schäfer's reference to participatory culture and to follow Webster's notion of society's continuation: society itself is not transformed, but the users have gained a new role. In the context of cultural practice, users transform into active participants and act within the production of media. In 2005, the BBC News website2 covered the topic of the digital age and user creativity, asking: “Are you a digital citizen? We want to hear about your digital life and how you use technology”. The user as digital citizen is, for example, involved in the creation of blogs, the use of social network sites, and the posting of pictures and videos online. Digital citizenship derives from technological developments that can collectively be called Web 2.0. Web 2.0 can be understood as a conceptual framework of distribution, interaction, and converging media formats. Tim O'Reilly defines Web 2.0 as a platform on which the user controls and generates content, reconfiguring the position of the user as active producer (O'Reilly 2005).
Digital citizenship reflects social progress through technology, transforming users into active participants. Pierre Lévy nuances the degree of participation with the emergence of ‘collective intelligence’ – a meeting of minds on the Internet – referring to the collaboration of individuals (Lévy 1997). In the context of media production, active participants generate, produce, publish, and distribute content within structures provided by companies and institutions. Schäfer states that “user participation is an extension of the cultural industries” (Schäfer 2011, 11), involving a collaboration between users and companies.
Wikipedia is an example of collective production, though it is often criticized regarding the issue of cultural freedom. According to Benkler, “cultural freedom occupies a position that relates to both political freedom and individual autonomy, but is synonymous with neither” (Benkler 2006, 274). The notion of participation holds several consequences for companies and politics, requiring a socio-political understanding of participation. Participation involves decisions and actions that are seen as legitimate by users but can be interpreted as inappropriate by politics and companies. According to Lévy (1997) and Schäfer (2011), participation establishes power structures in which users and politics, and users and companies, complement or oppose each other.
The relation between users and politics, and users and companies, shapes the control and regulation of media production, and gives rise to the discussion of democracy.

2 For the complete article see: http://news.bbc.co.uk/2/hi/talking_point/4678631.stm
According to Lessig, who questions the political impact of new media, there is already a sense of antidemocracy (Lessig 2000). In The Digital Revolution, the Informed Citizen, and the Culture of Democracy, Jenkins and Thorburn describe Pool's framework of communication technologies and democracy, envisioning a decentralized and participatory media environment in which technologies of freedom will attempt to create control (Jenkins & Thorburn 2003).
Habermas's late theory of the public sphere is fundamentally about democracy and growing complexity. As participatory culture involves the active role of users within cultural production, the question is whether Habermas's fundamental theory of the public sphere can still be adapted. An aspect that has been neglected and not prominently enough theorized is the quality of Web 2.0 platforms as public space (Schäfer 2011). Web 2.0 platforms are in one sense a public space, and in another sense not. Within public space, users demand a cultural freedom, but with the emergence of socio-political processes on Web 2.0 platforms users turn into citizens (Schäfer 2011).
The contemporary stress on participatory culture involves two different sides: the enlargement of the public sphere, and the surveillance and commodification of users (Lister et al. 2009). The revision of the public sphere by Web 2.0 is an important theme within scholarship and public net discourse, and is compared with Habermas's fundamental theory of the public sphere. New forms of electronically mediated discourse are taken into account in the discussion of democracy in the age of the public sphere (Poster 1997).
This article argues that participation within the socio-political processes of Web 2.0 transforms the user into a citizen. In view of Habermas's fundamental theory and Schäfer's participatory culture, this article builds upon the notion of the public sphere within Web 2.0, regarding what disables or enables the ‘public’ in terms of participation.
Cult of Public Opinion
Participation and public sphere are two closely related concepts, both depending on technology, economy, culture, and politics to lower the barrier to participation, essentially for the creation of a public sphere. To demonstrate how participation can influence the public sphere within Web 2.0 platforms, first Habermas's conceptualization of the public sphere will be discussed, and secondly, how participation within Web 2.0 gives rise to socio-political processes.
Friedland et al. (2006) provide a revision of the public sphere and refer to Habermas's Between Facts and Norms (BFN), which gives a more recent concept of the public sphere than Habermas's earlier work The Structural Transformation of the Public Sphere. In BFN, Habermas describes the public sphere as multiple publics tending toward increased fragmentation and privatization, as political and economic systems increase in complexity and autonomy (Habermas 1992). Habermas's definition of the public sphere reflects the growing centrality of networks, and according to Friedland et al. this centrality is caused by the existence of autopoietic dynamics within networks (Friedland et al. 2006). Autopoietic dynamics means that networks have a self-organizing character. This self-organizing character governs the aspects of the public sphere, balancing between an open system and an institutionally constrained one. An open system is characterized by deliberation, including self-regulation and communicative reflexivity, whereas an institutionally constrained public sphere involves mediated communication dominated by politics and elite discourse (Friedland et al. 2006).
The involvement of mediated communication reflects the influence of politics, which limits the ‘public’ in the public sphere. According to Mejias, “the public is where opinion can be expressed freely and at the same time informs action” (Mejias 2010, 606). In participatory culture, opinions are expressed freely and inform action as users produce, generate, and distribute information. Due to easy-to-use interfaces within Web 2.0 platforms, every user can become a participant, which makes Web 2.0 platforms a space of inclusivity. Inclusivity is one of Habermas's four aspects of democratic practice in conceptualizing the public sphere; the other aspects are equality, transparency, and rationality. Dean argues that the danger of inclusivity lies in the disempowerment of intellectuals. In participatory culture, every user can become a publisher of content regardless of the value of authenticity and credibility. An example of intellectual disempowerment within a Web 2.0 platform is the collaboratively produced online encyclopedia Wikipedia. Wikipedia allows users to produce, edit, delete, and share information. A criticism of online encyclopedias like Wikipedia is that posted items can include errors, lack of expertise, plagiarism, and copyright infringement. Even though such criticism exists, Wikipedia constitutes an extension of the cultural industries sustained by user activities (Schäfer 2011). Wikipedia demonstrates how technological advancements make social progress possible.
The social progress achieved through technological advancements within Web 2.0 describes the notion of socio-technological advances. Scholz, though, argues that Web 2.0 does not represent socio-technological advances, and deflates the claim of user empowerment: intellectuals still define what does and what does not enter the public discourse (Scholz 2008). Noam Chomsky referred to the justification of power in the role of intellectuals in his lecture Responsibility and Integrity: The Dilemmas We Face on 15 March 2011, stating that intellectuals have the privilege and opportunity to narrow the public discourse (Chomsky 2011). Web 2.0 as a postmodern communication space brings forth new structures of power and authority (Lister et al. 2009). As Web 2.0 blurs the boundaries of social, political, and ethical dimensions (Zimmer 2008), the power structures of public discourse perform an important role for equality and visibility. Users become blind to the ideological meaning of technologies (Postman 1992) as the power and ubiquity of Web 2.0 rise.
Participation within Web 2.0 provides users with articulation and collective power
through cultural expression in media practices (Jarrett 2008). Web 2.0 technologies
can empower disadvantaged users, providing the possibility for discussion and
political debate wherein public opinion can form. The communication flow between
political systems and users is described by Habermas as follows: public opinion
generates influence; influence is transformed into ‘communicative power’ through
media channels; and communicative power is legislated into ‘administrative power’
via democratic procedures (Habermas 1994). In other words, communicative power
is persuasive communication created by users themselves to influence the political
system and corporations in the form of participation, including public opinions and
cultural expressions.
Communicative power is the side of counter-steering, involving the cooperation
and mutual understanding of users, whereas administrative power forms the side
of steering, which describes the activities of politics and companies aimed at
influencing users’ cultural freedom (Friedland et al. 2006). In other words,
communicative power involves the free expression of users, but at the same time it
provokes reactions from politics and companies, who struggle to control users’
participation.
The interplay between users as participants and the mediated communication
dominated by political systems and corporations opens the discussion for socio-
political processes in Web 2.0 platforms. Web 2.0 platforms empower disadvantaged
users by creating a participatory culture, thereby establishing
a place to form communicative power as an activity of resistance towards political
systems and corporations. Senft suggests that participatory culture is a ‘cult of
public opinion’ (Senft 2000).
Zuckerberg for President
As participatory culture is a cult of public opinion, Web 2.0 platforms provide a
place for public discourse, petitions, and civil acts. An example of public discourse
and civil acts is the reaction to privacy settings within Web 2.0 platforms: when a
platform’s settings are forced upon the users, users start to show resistance
in old-fashioned ways such as boycotts or petitions in order to make the platform
change its rules. This section discusses Facebook as an example of a Web 2.0 platform
that on the one hand encourages users to participate in the company’s decisions
but on the other hand creates a notion of governance.
That Facebook acts like a public policy website is demonstrated by a video of
Facebook’s CEO Mark Zuckerberg. In 2009 Zuckerberg posted on the Facebook
Governance Site page a video requesting that Facebook users vote on which
documents3 should govern the Facebook site. Facebook users were asked to provide
feedback on two new documents: the new Facebook Principles and the Statement
of Rights and Responsibilities.
A community that large and engaged needs a more open process and a voice
in governance. That’s why a month ago, we announced a more transparent
and democratic approach to governing the Facebook site.
(Zuckerberg on Facebook Governance Site 2009)
Zuckerberg’s quote reflects the openness of Facebook as a community that users
can be part of, enhancing the feeling of participation and inclusivity, while the
announcement of a transparent and democratic approach demonstrates Facebook’s
concept of visibility. Inclusivity and visibility are two of the four aspects of
Habermas’ democratic practice in conceptualizing the public sphere mentioned
before, but they do not hold when it comes to the public of Web 2.0. Due
to socio-political processes within Web 2.0 platforms, there is still the interplay
between users’ communicative power and the mediated communication dominated
3 The changes made in the Privacy Policy documents after the voting can be seen at: http://www.facebook.com/fbsitegovernance?v=app_4949752878
by political systems and corporations.
Criticism of the ‘voting on Facebook’ reflects the existence of mediated
communication by the corporation, suggesting that participation is not fully
established. There was a small catch within the voting process:
For this vote and any future one, the results will be binding if at least 30
percent of active Facebook users at the time that the vote was announced
participate. An active user is someone who has logged in to the site in the
past 30 days.
(Zuckerberg on Facebook Governance Site 2009)
According to critics, the thirty percent participation threshold is practically
unattainable. Privacy International’s director Simon Davies argues that the
voting is a fraud: “If this is a genuine attempt to give users control then give them a
genuine vote, not a symbolic one; otherwise, stop wasting everyone’s time”.4
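The binding condition described in the quote above reduces to a simple turnout calculation. The sketch below is my own illustration (the function name and the 2009 user figure are illustrative assumptions, not Facebook’s data):

```python
def vote_is_binding(votes_cast: int, active_users: int,
                    threshold: float = 0.30) -> bool:
    """A vote binds Facebook only if turnout reaches the threshold share of
    'active users' (those logged in within 30 days of the announcement)."""
    return votes_cast >= threshold * active_users

# Illustrative figures: with on the order of 200 million active users in 2009,
# the rule demands roughly 60 million voters for a binding result.
print(vote_is_binding(votes_cast=600_000, active_users=200_000_000))    # False
print(vote_is_binding(votes_cast=60_000_000, active_users=200_000_000))  # True
```

Seen this way, the critics’ point is arithmetic: a threshold requiring tens of millions of voters makes a binding outcome practically impossible, so the vote remains symbolic.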
Facebook’s attempt at participation serves as a public interface between the
company and its users, explaining the company’s decisions in a collaborative
development (Schäfer 2011). Interaction with participants engaged in the company’s
decisions creates a socio-political process, whereby transparency appears to be a
crucial aspect in establishing a culture of governance (Schäfer 2011). The quote
above and the related criticism reflect the socio-political consequences of user
participation and how a corporation deals with them. In Bastard Culture! How User
Participation Transforms Cultural Production, Schäfer makes a distinction between
the confrontation, implementation, and integration of participation. Facebook’s
voting is an example of implementing ‘participation’. Participation is placed
between quotation marks here because it is not certain whether Facebook neglected
the user activities or really took them into consideration.
The video of Zuckerberg resembles a president talking to his population, where
‘the population of Facebook’ is invited into a democratic action. Democratic action
involves organized activities of users, which turn users into citizens (Schäfer
2011). In Web 2.0 platforms, as the example of Facebook demonstrates, an
interplay between users, political systems, and corporations exists in order to
maintain participation in cultural industries and media practices. Dahlgren puts
4 Found at: http://www.readwriteweb.com/archives/facebooks_site_governance_vote_a_massive_con.php
emphasis on the importance of socio-cultural interaction, arguing that participation
in political discourse forms a balance between political systems and civil society and
is a precondition for democracy and the creation of a public sphere (Dahlgren 1995).
¡Viva La Resistance! Conclusion
As users demand cultural freedom, the emergence of socio-political processes on
Web 2.0 platforms constitutes the user as citizen. The ‘public’ sphere refers
to the mutual understanding and cooperation between users when it comes to a
safe establishment of participatory culture. The public is alive and well, although it
will never be quite the same as in Habermas’ fundamental theory of the public
sphere. Due to technological advancements and Web 2.0 platforms, the public
sphere is revised as a cultural public where power structures are reshaped by the
blurring boundaries between the social and the political. The public sphere is
threatened by power structures that attempt to control the participatory culture of
users, but at the same time users can accomplish change and resistance through
communicative power.
Habermas’ four political norms for conceptualizing the public sphere are only
partly adapted in the cultural public. The ‘public’ is formed not by visibility but by
the existence of mutual understanding and cooperation between users, and the act
of resistance binds the users together as a public. Inclusivity will remain, as
technological advancements within Web 2.0 provide easy-to-use interfaces for every
user to become a participant. The third political norm, equality, is still in need of
discussion, as participation in this paper is not valued at different levels. The
example of Wikipedia, though, demonstrates a blurring between the posting of
content by common users and by intellectuals, where participation exists regardless
of the value of authenticity or credibility.
The socio-political processes within Web 2.0 platforms form an interplay between
users, political systems, and corporations, in which the transparency of both groups
is a crucial aspect in establishing a culture of governance and in extending the
value of user participation. In terms of participation, Web 2.0 platforms can be
conceived as a playground where on the one hand the cultural public tries to
influence political systems and corporations through participation, and on the other
hand political systems and corporations confront, implement, or integrate this
participation. It is a playground for the creation of cultural freedom, where
participation is the activity of playing, cultural freedom is the accomplishment of
the play, and power structures have to be defeated to be the ultimate winner.
References
Benkler, Yochai. 2006. Cultural Freedom: A Culture Both Plastic and Critical. The Wealth Of Networks: How Social Production Transforms Markets And Freedom, 273-301. New Haven, CT: Yale University Press.
Chomsky, Noam. 2011. Responsibility And Integrity: The Dilemmas We Face. Presented at the lecture for Social Responsibility Of The Artist. Utrecht University, Utrecht, March 15.
Dahlgren, Peter. 1995. Television and the public sphere: Citizenship, Democracy, And The Media. London: Sage.
Habermas, Jürgen. 1994. Three Normative Models Of Democracy. Constellations 1, no.1:1-10.
Habermas, Jürgen. 1992. Further Reflections On The Public Sphere. Trans. T. Burger. C. Calhoun (Ed.). Habermas And The Public Sphere, 421-461. Cambridge, MA: MIT Press.
Habermas, Jürgen. 1992/1996. Between Facts And Norms. Trans. W. Rehg. Cambridge, MA: MIT Press.
Friedland, Lewis A., Thomas Hove, and Hernando Rojas. 2006. The Networked Public Sphere. Javnost The Public 13, no.4:5-26.
Jarrett, Kylie. 2008. Interactivity is Evil! A Critical Investigation of Web 2.0. First Monday 13, no. 3 (March). http://journals.uic.edu/fm/article/view/2140/1947
Jenkins, Henry. 1991. Textual poachers. Television fans and participatory culture. New York: Routledge.
Jenkins, Henry. 2002. Interactive audiences? The collective intelligence of media fans. The new media book. Dan Harries (Ed.). London: BFI. http://web.mit.edu/21fms/www/faculty/henry3/collective%20intelligence.html
Jenkins, Henry. 2006a. Fans, bloggers, and gamers: Exploring participatory culture. New York: NYU Press.
Jenkins, Henry. 2006b. Convergence Culture: Where Old And New Media Collide. New York: NYU Press.
Jenkins, Henry et al. 2006. Confronting The Challenges Of Participatory Culture: Media Education For The 21st Century. MacArthur Foundation. http://www.digitallearning.macfound.org/atf/cf/%7B7E45C7E0-A3E0-4B89-AC9C-E807E1B0AE4E%7D/JENKINS_WHITE_PAPER.PDF
Jenkins, Henry, and David Thorburn. 2003. Democracy and New Media. Cambridge: MIT Press.
Lessig, Lawrence. 2000. Code and other laws of cyberspace. New York: Basic Books.
Lévy, Pierre. 1997. Collective Intelligence: Mankind’s Emerging World in Cyberspace. Cambridge, MA: Perseus Books.
Lister, Martin et al. 2009. New Media: A Critical Introduction. London: Routledge.
Lyotard, Jean-François. 1984. The Postmodern Condition. Manchester: Manchester University Press.
Postman, Neil. 1992. Technopoly: The Surrender Of Culture To Technology. New York: Knopf.
O’Reilly, Tim. 2005. What Is Web 2.0: Design Patterns and Business Models for the Next Generation of Software. O’Reilly Net. http://oreilly.com/web2/archive/what-is-web-20.html
Poster, Mark. 1997. Cyberdemocracy: The Internet And The Public Sphere. Virtual Politics. Ed. David Holmes. Thousand Oaks, CA: Sage.
Schäfer, Mirko T. 2011. Bastard Culture! How User Participation Transforms Cultural Production. Amsterdam: Amsterdam University Press.
Scholz, Trebor. 2008. Market Ideology and the Myths of Web 2.0. First Monday 13, no. 3. (March). http://journals.uic.edu/fm/article/view/2138/1945
Senft, Theresa. 2000. Baud girls and cargo cults. The World Wide Web and Contemporary Cultural Theory. Eds. A. Herman and T. Swiss. London: Routledge.
Webster, Frank. 2002. Theories Of The Information Society. London: Routledge.
Zimmer, Michael. 2008. Preface: Critical Perspectives on Web 2.0. First Monday 13, no. 3. (March). http://journals.uic.edu/fm/article/view/2137/1943
Legionnaires of Chaos: “Anonymous” and governmental oversight of the Internet
Alex Gekker

Abstract
This paper aims to take a closer look at the issue of user control online, through the
prism of anonymity and responsibility. It does so by examining the recent events
which were part of “Operation Payback”, initiated by the online organization/collective/gathering
“Anonymous”. In response to several companies’ perceived transgressions,
Anonymous commenced an attack on their public domain servers.
In addition, they provided sympathetic web users with a free and easy-to-use DDoS
tool to facilitate the attacks. This paper draws a distinction between Anonymous
and hackers, and compares the relation of the two groups to the powers that be.
The aim is to show how a multitude of varying factors has led to increased resistance
to Anonymous because they are not hackers, and how they may in fact contribute
to limiting user control online rather than empowering it.
Keywords: “Anonymous”, online anonymity, hackers, user control, smart mobs,
DDoS
“Anonymous is infinity divided by 0. = Syntax error”. Encyclopedia Dramatica, “Anonymous”.
Tracing Anonymous
As the Internet gained dominance as a part of the everyday physical world, rather than
an alternative to it, governmental oversight of the online sphere increased accordingly.
Rather than a disconnected cyberspace of disembodied personas, as portrayed in
its early days, the internet - and even more so the web - has become a crucial component
of commerce, governance and media. Power structures around the world respond to this
growing importance by reducing the tolerable margins of “devious” internet behavior.
This is done via particular legislation and oversight implemented by governmental
and regulatory structures. This paper aims to describe this phenomenon through a
particular case study of the online collective/gathering/organization “Anonymous”.
By using the methodology of Actor-Network Theory (from now on, ANT) I will
examine Anonymous and the influence they exert on user control online.
The ANT perspective (Latour 1987; Latour 2008; Law 1992) argues that in order to
understand modern society, a researcher must follow the work-nets of human and
non-human actors (or rather, actants) through cultural-material artifacts. We can
thus facilitate meanings by tracing and relating the different actors to one another.
One must discard theoretical constructs (such as ideologies) which are invisible and
thus irrelevant to the actors in the system. In this way it is possible to locate the
underlying currents in the decision-making processes of specific endeavors, and to
learn about the construction of symbiotic human-technological relations in society.
To analyze Anonymous through ANT methodology we must take into account the
sociological and anthropological perspectives on the origins of the organization; their
places of gathering and methods of communication (both of which rely to a great extent
on technical means); the current interplay of commercial, political and private actors
that operate within the web; as well as the specifics of recent changes in nation-states’
and corporations’ attitudes toward cyberspace deviation, as reflected in lobbying and
the legislation following it.
Another methodological note must be made here. ANT theory originated in STS
and organizational studies. The methodology was developed in laboratories, plants
and offices, with strong anthropological overtones. It suggests participant observation
as one of its main tools, or at least the ability to interview the actors in question in
order to trace their actions. It is more of a sociological ‘field’ study that requires direct
interaction with the objects of inquiry than an office analysis of data. Anonymous
is problematic in the sense that there is no one to question, no field to enter
for inquiry. As will be discussed further on, the group lacks formal representatives
and membership. Furthermore, as recently leaked inner chat logs of an Anonymous
operation have disclosed (Cook and Chen 2011), the members have a fondness for
purposely misleading anyone who tries to gather insights about their gathering1. In my
research I followed the methodology described by Roversi (2008) in his study of hate
groups on the net: inside observation of the openly available sections, while applying
analysis based on other sources and current political affairs.
This paper will first offer a brief discussion of hackers and their traditional role online,
suggesting that despite their public image, governments and corporations have
in fact enjoyed cordial relations with this subculture over the years. I will proceed to
describe Anonymous, first as a web collective and then as a quasi-political organization.
I will show how, despite their reputation, Anonymous are not hackers in
the traditional sense of the word, and how this fact underlies their relations with the
authorities. I will then discuss how the convergence of the offline world with
the online, together with the unique characteristics of Anonymous discussed here,
positions them as a threat in the eyes of governments worldwide. My aim is to show
how a multitude of varying factors has led to increased resistance to Anonymous
because they are not hackers, and how they may in fact contribute to limiting user
control online rather than empowering it.
Hacker culture
Anonymous are not hackers. At least, that is how they think of themselves2, or the
image they try to project. While a detailed account of what Anonymous are will follow,
we must also consider what they are not. It is easy to pin the characteristics of this
collective as rooted in “Hacker Culture”, but that alone would not help us understand
the motivations better. In Latourian terms, hacker culture is an intermediary rather
than a mediator – a collective term for the motivations and practices of quite a large
and diverse group. In order to understand what Anonymous are I will first trace
the origins and the meanings of the term, and then show how Anonymous fail to fit
within this form.
1 In one instance, per a journalist’s request to gain access to “inner circles” of the group, members discuss amongst themselves the possibility of creating a fake IRC channel with bogus codewords and displaying it to her.
2 There are several examples of how Anonymous don’t consider themselves hackers, although some individuals and perhaps even leaders (as much as the term applies) within this collective exhibit hacker characteristics. One is a press release (ANONYMOUS 2010) originating from the group which states: “Anonymous is not a group of hackers. We are average Internet Citizens ourselves and our motivation is a collective sense of being fed up with all the minor and major injustices we witness every day”. Another example comes from Encyclopedia Dramatica, one of the chief online collaborative forums associated with the group: “Anonymous can be anyone from well-meaning college kids with highly idiosyncratic senses of humor trying to save people from Scientology, to devious nihilist hackers, to clever nerds, to thirteen year old boys who speak entirely in in-jokes on an endless quest for porn…” (Encyclopedia Dramatica)
The Internet has always had a place for libertarian individuals who used their technical
skills to bend and break rules. Sociologist Manuel Castells (2001) argues that hackers
are one of the pillars of modern web culture. Unlimited access to information, disdain
for authority and the desire to prove intellectual capability are the paramount ideals
that drive this unique subculture. Yet hacker culture originated before the web.
Some of the Internet’s most popular applications, from email to the web itself,
were created by individuals following their curiosity and working on personal
technological projects rather than on what they were supposed to do (Castells 2001;
Barabási 2002). The culture is about discovery and innovation, albeit not in a formal
way. Technology researcher and critic Howard Rheingold, a central figure in one of the
first counter-cultural digital bulletin boards, the WELL (“Whole Earth ‘Lectronic Link”,
which predated the WWW by several years), goes as far as to claim that those libertarian
values are imbued in the technological understructure of the Internet. He quotes his
WELL colleague and co-founder of the Electronic Frontier Foundation, John Gilmore,
who explains that: “The net interprets censorship as damage, and routes around
it” (Rheingold 2000, xxii). While technically inaccurate (censorship is difficult, but
possible), these attitudes show how much the libertarianism of hacker culture is
perceived to be imbued in modern online life.
Governments and corporations have tolerated this culture because, despite its
informality and tendency toward insubordination, those talented tinkerers have
generated real value. Castells (ibid.) suggests that the hacker culture that spread
through US and UK universities and similar facilities was largely responsible for the
Western technological advantage in the Cold War. Soviet researchers, despite their
scientific excellence, were too constrained by political oversight and efficiency plans
to produce the sporadic breakthroughs in computer science and electronics that
characterized the West. Hackers were instrumental in the construction of the network
society (van Dijk 2005) by providing the technologically distributed networks that
allowed computer-mediated communication to substitute for face-to-face
communication in personal and business contexts, while bringing some of their own
free-spirited culture into those networks. But they were never counter-cultural in the
literal sense of the word.
Hackers are by definition more interested in the development and spread of technology
than in its social context. By observing modern-day Silicon Valley giants,
which originated in the early days of the net, one can see how corporations were
started, alliances were formed and positions of power accepted. As Wayner (2000) notes
in his book on free software, hacker culture is primarily technological, rather than
ideological. One does not stop being a hacker by starting a corporate job or getting
elected to a governmental position. Notorious hackers can become board members
of ICANN (Castells 2001, 32) or chief scientists designing the next generation of the
internet for the US government (Lanier). This distinction is important when
we later consider members of the Anonymous collective, who base their identity around
the content they produce and consume, or around their political mobilization, rather
than around proficiency with technology.
We Are Legion: Anonymous in the Making
It is as difficult to describe how Anonymous came to be as it is to pinpoint what they
are. They appear to have originated from several highly idiosyncratic web forums,
IRC channels and websites dedicated to web culture. The online venues associated
with Anonymous are primarily the 4chan.org image board (and especially its /b/
“random” board) and Encyclopedia Dramatica, the online-culture antithesis to
Wikipedia: a wiki devoted to internet memes, provocative language and shocking
images (Elliott 2008). Their ethos defies definitions such as “group” or “organization”.
They claim to lack formal organization or leadership3. As a press release they issued
stated:
Anonymous is not a group, but rather an Internet gathering. Both Anonymous
and the media that is covering it are aware of the perceived dissent between
individuals in the gathering. This does not, however, mean that the command
structure of Anonymous is failing for a simple reason: Anonymous has a very
loose and decentralized command structure that operates on ideas rather than
directives. (ANONYMOUS 2010, 1)
3 Leaked protocols of online meetings have later shown that this is not completely true, and some leadership is allocated (or assumed) for specific tasks. I will deliberate on this in the next part.
Anonymous’ origins can be traced back to the days before Web 2.0’s neo-liberal
culture (O’Reilly 2005; Jarrett 2008), when the web was still inhabited by mostly
disembodied and nameless entities. Before Google suggested that individuals open
email accounts under their own names, before Amazon connected one’s shopping
habits to one’s credit card account, and before Facebook forbade the use of fake names
in the creation of profiles, the code of conduct of the web treated anonymity as the norm.
However, the anonymity of Anonymous is not akin to the classic state of Web 1.0,
where it meant the ability to maintain a consistent alias or persona without having to
reveal one’s credentials in physical space. As blogger and social media critic Jana
Herwig (Forthcoming) notes in her account of 4chan4:
“While conventional anonymity online meant that one’s real name and identity
were protected through the use of (unique and/or registered) nicknames,
4chan takes this one step further: Because no one can register, no one may
claim a nickname for him or herself”.
This is an important observation. Anonymous is not rooted in an ideology of
anonymity per se, but rather in one which promotes a lack of identity. On 4chan,
unlike social networking sites or forums, there are no permanent identities (true or
otherwise), no “social graphs” of friends/followers and no “feeds” of content connected
to one’s persona. Each post on the image board has a unique identifier, but this
9-digit code is the only reference possible on the website. Posts are deleted after a
period of hours or days (depending on popularity) and no archival record remains
on the site. Although users posting or replying to a thread might assume an alias in
the process of composing, this is not required and is in fact discouraged. The effect is
endless boards of images and text coming from predominantly “Anonymous”
(non-named) posters. As Galloway (2004) would note, this is an example of protocol
shaping social interaction. Without a means to distinguish between the users on the
board, the associated feeling is of a huge hive-mind communicating with itself; a
single, yet heterogeneous organism, or perhaps a schizophrenic arguing with multiple
personalities.
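The protocol described above can be sketched as a toy data structure. This is my own illustration, not 4chan’s actual implementation; the class and method names are hypothetical. The essential properties are that posts carry only a sequential numeric identifier, never a registered identity, and vanish once they expire:

```python
import time
from collections import OrderedDict
from typing import Optional

class EphemeralBoard:
    """Toy model of a 4chan-style board: no accounts, no required aliases,
    a 9-digit post number as the only possible reference, and no archive."""

    def __init__(self, ttl_seconds: float):
        self.ttl = ttl_seconds
        self._next_id = 100_000_000      # first 9-digit post number
        self._posts = OrderedDict()      # post_id -> (timestamp, text)

    def post(self, text: str) -> int:
        """Accept a post from anyone; return its number, the sole handle."""
        post_id = self._next_id
        self._next_id += 1
        self._posts[post_id] = (time.time(), text)
        return post_id

    def prune(self) -> None:
        """Delete expired posts; nothing is archived on the site itself."""
        now = time.time()
        for pid in [p for p, (ts, _) in self._posts.items()
                    if now - ts > self.ttl]:
            del self._posts[pid]

    def read(self, post_id: int) -> Optional[str]:
        entry = self._posts.get(post_id)
        return entry[1] if entry else None
```

Because the only way to “save” content is to copy it off-site before pruning, both identity and memory are externalized, which is precisely the condition under which the hive-mind effect described above arises.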
This notion is indicated in Anonymous’ unofficial motto: “we are legion”. Taken from
the New Testament, this quote references the submergence of multiple identities in
one entity, existing as a whole but disappearing when one tries to pinpoint individuals.
4 Despite its idiosyncratic content, it attracts about 9.5 million unique users monthly (Herwig Forthcoming)
Unlike the hacker culture previously discussed, Anon culture seeks no recognition,
intellectual or otherwise.
“Anonymous is not a person, nor is it a group, movement or cause: Anonymous
is a collective of people with too much time on their hands, a commune
of human thought and useless imagery. A gathering of sheep and fools,
assholes and trolls, and normal everyday netizens. An anonymous collective, left
to its own devices, quickly builds its own society out of rage and hate… As
individuals, they can be intelligent, rational, emotional and empathetic. As
a mass, a group, they are devoid of humanity and mercy. Never before in the
history of humanity has there once been such a morass, a terrible network of
the peer-pressure that forces people to become one, become evil. Welcome to
the soulless mass of blunt immorality known only as the Internet” (Encyclopedia
Dramatica)
Another aspect of Anonymous culture is its apparent nihilism. The main reason for
the collective to spring into action is “lulz” (Bair 2008) – a continuous search for
entertainment through the pursuit of the awkward, bizarre and unconventional, often
at the expense of others. Derived from the infamous web abbreviation for “Laughing
Out Loud”, lulz are the reason for invading another forum en masse with relentless
spam comments (“trolling”) or for launching a worldwide protest against Scientology.
The goal may be righteous or not, and the targets may “commit crimes” against
Anonymous or just be on the wrong server at the wrong time – if it provides
entertainment, it is worth doing.
The last aspect of Anonymous culture worth exploring is the way meanings are
generated through the inception of memes. A meme (Dawkins 1976) is the smallest
unit of cultural information; it can be an idea, a fashion or an architectural style.
Online, the word came to represent a joke, a phenomenon or a catchphrase.
Anonymous thrives on memes, and 4chan is considered to be a central “meme factory”
for the rest of the web (Herwig Forthcoming). One such example is “lolcats” (badly
misspelled comments pasted on top of animal pictures), which originated on the
boards and became an internet phenomenon.
What is interesting about memes, especially in the early stages of their origination
and dissemination, is the fact that they truly evolve via natural selection. There is no
democratic process, no voting mechanism and no leader who decides what is a meme
and what is not. In fact, since 4chan lacks archival memory and the only way to
preserve content from the website is by copy-pasting and saving it onto individual
hard drives, memes are prone to oblivion. Only by crossing a certain invisible
threshold of acceptance does a meme continue to live. It is a unique process, which
may reflect something of Jodi Dean’s (2003, 108) “neodemocracies”, where consensus
is achieved through struggle and contestation between opposing views.
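The “invisible threshold” can be illustrated with a toy selection model. This is entirely my own sketch (the probabilities and the threshold are arbitrary, since the text gives no numbers): each viewer independently saves or reposts a meme, and because the board keeps no archive, only memes copied often enough before their thread expires survive.

```python
import random

def meme_survives(save_prob: float, viewers: int, threshold: int,
                  rng: random.Random) -> bool:
    """A meme persists only if enough viewers copy it to their own drives
    before the thread is pruned; otherwise it is lost to oblivion."""
    copies = sum(rng.random() < save_prob for _ in range(viewers))
    return copies >= threshold

rng = random.Random(2011)
# A widely seen meme with even a small save rate will usually clear the bar,
# while a barely seen one almost never will:
print(meme_survives(save_prob=0.01, viewers=5000, threshold=10, rng=rng))
print(meme_survives(save_prob=0.01, viewers=50, threshold=10, rng=rng))
```

The point of the model is the absence of any gatekeeper in the loop: survival is decided by aggregate, uncoordinated copying rather than by a vote.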
To sum up, Anonymous is not only anonymous, but also in many ways identity-less.
Its ethos is somewhat nihilistic, including self-derogatory rhetoric and action based
on a fun factor rather than on specific values or agendas (except, perhaps, the value of
online anonymity). And lastly, ideas, agendas and motivations compete within the
gathering for dominance, and when one “infects” the critical mass of brains required,
the collective as a whole decides to act upon it. This factor is crucial when discussing
Anonymous as a political mobilization.
Politically Active
The first time Anonymous climbed from the (relative) obscurity of the internet into
the headlines was due to its involvement with the Church of Scientology. Enraged by
the church/sect’s attempt to remove a leaked internal video, and interpreting it as
a violation of “the laws of the internet”, the Anonymous movement decided to fight
back. The struggle included both online hacktivism against Scientology’s websites and
offline demonstrations in which small masked groups of Anonymous members
disrupted Scientologists in hundreds of locations around the globe5 (Anon. 2008;
Elliott 2008; Bair 2008). The scope of this paper cannot cover the entire affair, and a
detailed account can be read in any of the aforementioned sources. The name that
emerged for this anti-Scientology campaign was “Project Chanology”, a portmanteau
of “Scientology” and “4chan”.
In the case of Project Chanology one should note how the authorities treated these outbursts against Scientologists: mostly by ignoring them. Several demonstrations were dispersed, but generally local and state authorities declined to intervene in what appeared to be a conflict between two sub-cultures (Arnoldy 2008). Furthermore, the Scientologists' "fair game" approach (Urban 2006) openly declared the intent to persecute those who oppose the church via physical, juridical and PR means. The masked protests of Anonymous activists left the Scientologists without real names or faces to target. Project Chanology, viewed from the previously discussed perspective, was very "Hackerish":
5 Anonymous' modus operandi was organizing quick, distributed, cell-based flash attacks, best described by Howard Rheingold's concept of smart mobs (Rheingold 2002).
a creative, even if somewhat rogue, solution to a problem that could not be tackled by other means, and one which played out in a peripheral, non-crucial field. To put it bluntly: both Scientology and Anonymous were too far from mainstream interests for anyone to pay attention.
This was not the case with "Operation Payback". It began as an anti-copyright campaign targeting organizations that persecuted pro-piracy activists, but was then re-targeted as online "artillery support" for Julian Assange and the Wikileaks organization (Correll 2010). Wikileaks, which had released hundreds of thousands of classified US documents online in the preceding month, was being classified as a criminal organization by the US and several European states. Following that, several large companies such as Amazon and Visa withdrew any dealings with Wikileaks and froze its accounts. Anonymous responded with a call rallying its supporters to "avenge Assange"6 by actively propagating Wikileaks' cause and by participating in DDoS7 attacks on the offending companies. The attacks were carried out by a web- and software-based tool named LOIC, which allowed anyone to join the assault without prior technical skills (Pras et al. 2010).
Later on, Anonymous hacked the accounts of the Internet security firm HBGary and produced compromising emails in which the company supposedly planned cyber and smear attacks against Wikileaks (Cook and Chen 2011). In contrast to earlier operations, this time Anonymous targeted rather mainstream organizations and corporations. The distributed, meme-based decision-making process proved effective in dealing with a real-time current-affairs situation. The development of technology allowed anyone who wished to participate in cyber-attacks to do so, eliminating the previously required expertise-based "hacker" mantle. The reaction of the authorities this time was strikingly different. Several activists were tracked and arrested (Cook and Chen 2011), and the FBI began an investigation of the attacks (Sandoval 2010). In the next and final part, I will discuss the premises that led to this change.
Governments
I would like to claim that the main reason for the change in governments' behavior lies in the changed role of the web in modern society (Chadwick 2008). Web-based
6 The complete text of the message can be seen here: http://en.wikipedia.org/wiki/File:Avenge_Assange_Anonymous.png
7 Distributed Denial of Service (DDoS) attacks are a simple method for disrupting access to a specific website by bombarding it with server requests sent from multiple computers, thus leaving the server unable to deal with incoming traffic.
giants try to assault online anonymity to achieve better segmentation of their users and turn profits. To do so, they seek governmental support "in order to preserve their property rights in the internet-based economy" (Castells 2001, 181). This economy demands the active participation of users in the content-generating and sharing platforms of web 2.0 (Scholz 2008). Anonymous and its activists demonstrate that this online participatory power can be used for disruption as well as for consumption.
Recent legislation shows that both the US and the EU have begun to take cyberspace quite seriously. Laws are being drafted to regulate cyber-security, cyber-crime and the management of one's identity online. Over 50 pieces of legislation have been discussed by the US Congress over the last two years (Hathaway 2010). Recent examples include acts like the Cybersecurity and Internet Freedom Act (Sen Lieberman 2011), which regulates the responsibility for ICT crimes and attacks in the US and proposes, among other things, a "kill switch" for the president that allows cutting access to the net infrastructure on a massive scale. Another example is the Cyber Security and American Cyber Competitiveness Act (Sen Reid 2011), which also deals with issues of privacy and regulates economic entities (albeit guaranteeing that no restrictions would be put on the ability of federal bureaus to access this information).
Outside the US the situation is no different. The EU has commenced the creation of the European Network and Information Security Agency (EU 2004), which has been ratified slowly but surely in member countries, giving governments more and more power over cyberspace. Russia is taking control of the digital world even more seriously. A recent example is a statement by the head of the Russian FSB (the successor of the notorious KGB) in which he suggested forbidding inside Russia services which use internal encryption, such as Gmail, Hotmail and Skype (Faulconbridge 2011)8.
Alt-Control
It would be far too presumptuous to suggest that Anonymous alone is responsible for this trend in governmental legislation. But it is a very visible instance of a larger problem for the powers that be. As the web becomes further entangled with everyday practices of commerce and politics on the one hand, and average citizens gain disruptive collective powers and technology (such as the LOIC DDoS tool) on the other, governments rightfully fear a loss of control at the hands of various smart
8 The paper's scope is far too narrow to include all possible examples of cyber legislation from recent years, and the discussion of steps taken in more authoritarian regimes (China in particular) could constitute a paper of its own. These examples are meant to show the increasingly active role governments try to assume in cyberspace, as well as their attempt to expropriate control over online life from private users and commercial entities.
mobs. Their response is to try to reclaim this control for themselves by limiting it for others.
Unlike in the case of "Hacker Culture", where the quirks of a handful of skilled mavericks could be tolerated in exchange for the potential benefits of their talents, Anonymous are not hackers. Though some of them without doubt display above-average skills in computer and network systems, they do not seek personal recognition, and their end goal is (anti)social upheaval rather than the improvement of existing technology. They are, as they claim, a decentralized hive-mind structure. They are neither a group, nor an organization, nor a cause. The self-proclaimed "Gathering" is indeed a fitting name. A better name, in van Dijk's terms, might simply be "a network". Or, in ANT terms: they simply refuse punctualization.
Anonymous may see itself as operating for the greater good of the average user. But on the playground of the modern web, its actions serve to emphasize how ordinary netizens have become the problem that hackers never were. The efforts perpetuated by Anonymous, especially under the chaotic, counter-cultural shroud they exhibit today, may in fact lead to a strengthening of governmental control online, rather than helping the fight to resist it.
References
Anon. 2008. The following post is [about] anonymous. Confessions of an Aca/Fan. April 3. http://henryjenkins.org/2008/04/anon.html.
ANONYMOUS. 2010. ANON OPS: A Press Release. October 10. http://www.wired.com/images_blogs/threatlevel/2010/12/ANONOPS_The_Press_Release.pdf.
Arnoldy, Ben. 2008. “Anonymous activists gaining strength online.” CSMonitor.com, March 17. http://www.csmonitor.com/USA/Society/2008/0317/p03s02-ussc.html.
Bair, Alex. 2008. “‘We are Legion’: An Anthropological Perspective on Anonymous.” Proceedings of the 2008 Senior Symposium in Anthropology. Department of Anthropology Idaho State University (April 30). http://www.isu.edu/~holmrich/senior_symposium/seniors2008.pdf#page=47.
Barabási, Albert-László. 2002. Linked: the new science of networks. Cambridge, MA: Perseus Pub.
Castells, Manuel. 2001. The Internet Galaxy: Reflections on the Internet, Business and Society. New York: Oxford University Press.
Chadwick, Andrew. 2008. “Web 2.0: New Challenges for the Study of E-Democracy in Era of Informational Exuberance.” ISJLP 5: 9.
Cook, John, and Adrian Chen. 2011. “Inside Anonymous’ Secret War Room.” Gawker, March 18. http://gawker.com/#!5783173/inside-anonymous-secret-war-room.
Correll, Sean-Paul. 2010. "Operation: Payback broadens to 'Operation Avenge Assange.'" Panda Security. December 6. http://pandalabs.pandasecurity.com/operationpayback-broadens-to-operation-avenge-assange.
Dawkins, Richard. 1976. The Selfish Gene. New York: Oxford University Press.
Dean, Jodi. 2003. “Why the Net is not a Public Sphere.” Constellations 10 (1): 95-112. doi:10.1111/1467-8675.00315. http://onlinelibrary.wiley.com/doi/10.1111/1467-8675.00315/abstract.
van Dijk, Jan. 2005. The Network Society: Social Aspects of New Media. 2nd ed. London: Sage Publications, November 5.
Elliott, D. C. 2008. "ANONYMOUS RISING." LiNQ: 96. http://www.linq.org.au/idc/groups/public/documents/ebook/jcuprd1_069515.pdf#page=96.
Encyclopedia Dramatica. "Anonymous." Encyclopedia Dramatica. http://www.encyclopediadramatica.com/Anonymous.
EU. 2004. Regulation (EC) No 460/2004 of the European Parliament and of the Council of 10 March 2004 establishing the European Network and Information Security Agency. 460/2004. http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=CELEX:32004R0460:EN:NOT.
Faulconbridge, Guy. 2011. Russian spy agency complains about Gmail, Skype. Yahoo! News UK. April 8. http://uk.news.yahoo.com/22/20110408/tpl-oukin-uk-russia-internet-553508c.html.
Galloway, Alexander R. 2004. Protocol: How Control Exists after Decentralization. Cambridge, MA: MIT Press.
Hathaway, Melissa. 2010. Cybersecurity: The U.S. Legislative Agenda, Part II. November. http://belfercenter.ksg.harvard.edu/files/short-summary-legislation-nov2010.pdf.
Herwig, Jana. Forthcoming. "The Archive as the Repertoire: Mediated and Embodied Practice on Imageboard 4chan.org." Minds and Matter: Paraflows 10 Symposium. http://homepage.univie.ac.at/jana.herwig/PDF/Herwig_Jana_4chan_Archive_Repertoire_2011.pdf.
Jarrett, K. 2008. “Interactivity is Evil! A critical investigation of Web 2.0.” First Monday 13 (3): 34–41.
Lanier, Jaron. Jaron Lanier’s Bio. http://www.jaronlanier.com/general.html.
Latour, Bruno. 1987. Science in Action. Cambridge, MA: Harvard University Press.
Latour, Bruno. 2005. Reassembling the Social: An Introduction to Actor-Network-Theory. Oxford: Oxford University Press.
Law, John. 1992. “Notes on the theory of the actor-network: Ordering, strategy, and heterogeneity.” Systems Practice 5 (4): 379-393. doi:10.1007/BF01059830. http://www.springerlink.com.proxy.library.uu.nl/content/k3250570778u53u5/.
O'Reilly, Tim. 2005. "What is Web 2.0: Design Patterns and Business Models for the Next Generation of Software." http://www.oreillynet.com/pub/a/oreilly/tim/news/2005/09/30/what-is-web-20.html.
Pras, Aiko, Anna Sperotto, Giovane C.M. Moura, Idilio Drago, Rafael Barbosa, Ramin Sadre, Ricardo Schmidt, and Rick Hofstede. 2010. "Attacks by 'Anonymous' WikiLeaks Proponents not Anonymous." Design and Analysis of Communication Systems Group (DACS), University of Twente, Enschede, The Netherlands (December 10). http://eprints.eemcs.utwente.nl/19151/01/2010-12-CTIT-TR.pdf.
Rheingold, Howard. 2000. The Virtual Community: Homesteading on the Electronic Frontier. Cambridge, MA: MIT Press, November 1.
Rheingold, Howard. 2002. Smart Mobs: The Next Social Revolution. Cambridge, MA: Perseus Pub., October.
Roversi, Antonio. 2008. Hate on the net: extremist sites, neo-fascism on-line, electronic jihad. Bodmin: Ashgate Publishing, Ltd.
Sandoval, Greg. 2010. "FBI probes 4chan's 'Anonymous' DDoS attacks." CNET News. November 9. http://news.cnet.com/8301-31001_3-20022264-261.html.
Scholz, Trebor. 2008. “Market Ideology and the Myths of Web 2.0.” First Monday 13 (3): 3. http://firstmonday.org/htbin/cgiwrap/bin/ojs/index.php/fm/article/viewArticle/2138/1945.
Sen Lieberman, Joseph I. 2011. S.413 - Cybersecurity and Internet Freedom Act of 2011. February 17. http://thomas.loc.gov/cgi-bin/bdquery/D?d112:5:./temp/~bdtPZW::|/home/LegislativeData.php|.
Sen Reid, Harry. 2011. S.21 - Cyber Security and American Cyber Competitiveness Act of 2011. http://thomas.loc.gov/cgi-bin/bdquery/D?d112:4:./temp/~bdtPZW:@@@L&summ2=m&|/home/LegislativeData.php|.
Urban, Hugh B. 2006. “Fair Game: Secrecy, Security, and the Church of Scientology in Cold War America.” Journal of the American Academy of Religion 74 (2) (June): 356–389. doi:10.1093/jaarel/lfj084.
Wayner, P. 2000. Free for all: How Linux and the free software movement undercut the high-tech titans. Harper business. http://www.jus.uio.no/sisu/free_for_all.peter_wayner/landscape.a5.pdf.
Cookies and the mindset of control
Kalina Dancheva

Abstract
In the contemporary architecture of the Internet, users can no longer live in perfect anonymity. Data is collected through log-in forms, where users consciously disclose their information, as well as by tracing mechanisms known as cookies. The visibility of the cookie depends on specific governmental policies which determine the conditions under which it can act. In this paper, I explore the cookie from the perspective of actor-network theory and examine the network of actors that it activates. Opening the cookie's black box aims to present a piece of the logic embedded in the contemporary internet architecture as it currently exists in the United States. I argue that the cookie is a carrier of market and political ideologies that aim to take users out of obscurity in order to gain control over their personality.
Keywords: control, resistance, regulation, data protection, market interests
Introduction
In the twenty-first century, Western countries enjoy a culture of freedom on the Internet. We are now used to getting free access, free music, and free software. This culture emerged after the period of the Cold War and the totalitarianism dominating Eastern Europe, which marked the world with a fear of governmental agents and spies tracing state authorities and regular people alike. After the reforms, the Internet signalled that a new type of society was emerging. The new space of the Internet promised freedom from regulation, from authorities and from interference with its users. The Internet of that time carried the values of freedom and, as John Perry Barlow proclaimed, it was "the new home of Mind" (Barlow 1996) where no external power was allowed to exist. But this architecture has changed. The new media lawyer Lawrence Lessig (2006) argues that the original design of the Internet has been transformed and is now controlled by an "invisible hand, pushed by government and commerce" (Lessig 2006, 4). The potential for control lies in the architecture of the Internet. The new media scholar Alexander Galloway (2004) argues that "the founding principle of the Net is control", rather than freedom, and that "control has existed from the beginning" (Galloway 2004, XV). But this type of control is different from the control imposed in the mid-twentieth century: it is exercised through "openness, inclusion, universalism, and flexibility" (Galloway 2004, 142). The Internet thus offers users both freedom and control. According to the contemporary media scholar Wendy Chun (2006), the Internet nowadays allows us to experience not absolute freedom but "forms of freedom" (Chun 2006, 3), deriving from tools that forbid us from having entire control over our actions.
In this paper, I argue that this type of control was made possible through specific commercial and governmental interests that aim to take users out of obscurity and to gain control over their personality. In order to examine this phenomenon, I will follow a small piece of data passed between the actors on this playground: the HTTP cookie.
Method of analysis
The analysis of the cookie is based on Actor-Network Theory (ANT). While social network analysis examines patterns in society by focusing on the relations between people or groups (Wellman and Marin 2010; Knox 2010), ANT looks beyond human interactions and emphasises the role of non-humans as an essential part of the social structure (Latour 2005; Law 1992; Akrich 1994). ANT scholars do not define social actors in terms of human or non-human, material or symbolic, but judge them on their capability to transform and to "make
a difference" (Latour 2005, 154). In the ANT perspective the cookie is a technical device that "has a script" and an "affordance" (Latour 1994, 31), inscribed by designers, which show the artefact's potential to act. The cookie is an actor that can be described as a "quasi-object" (Latour 1993, 51) travelling within a network of relations. Although they are passed between actors, quasi-objects are not intermediaries that merely transport meaning; they are actors, or "actants" (ibid, 54), that can transform social relations. Moreover, exploring the cookie from an actor-network perspective means recognizing the role of the "hyphen", because it "deploy[s] actors as networks of mediations" (Latour 2005, 136). In this sense, the cookie is itself a network which mobilizes a chain of different actors. In order to explore this network, we need to open the black box of the cookie and trace the actors it activates. For the purposes of this paper, I will focus on user control in the USA by following these actors: DoubleClick (an advertising company), the Electronic Privacy Information Center (EPIC), and the Federal Trade Commission. By focusing on these actors, the article aims to reveal a piece of the logic embedded in the design of the contemporary Internet playground and to explore how it defines user control.
HTTP Cookie
These cookies aren’t tasty
USA Today (2010)
An HTTP cookie, or magic cookie, is a programming term that refers to a "piece of 'transaction state' (connection) information left on the client before the HTTP transaction cycle is concluded" (Berghel 2001, 20). It is also described as an "opaque identifier" (Raymond 1996). Cookies were first introduced by the Netscape company as a tool for e-commerce that would assist the user when shopping online by saving information about the selected items and creating a "shopping cart" (Hormozi 2005, 51). In this sense, the arrival of cookies is connected to the commercialization of the Internet, as they were primarily designed for market purposes. The cookie functions as a "state management mechanism" (Kristol 2001, 151) which allows a web site to identify and track a user's behaviour. Its role is to overcome a limitation of the basic protocol design, which by default is "state-less" (Lessig 2006, 47), meaning that without additional mechanisms a web site cannot connect a user's current request to any preceding one. The state information of the cookie includes URL addresses, and when a request is made to these URLs, the browser sends this information to the corresponding servers. The information tracked by the cookie is stored on the user's hard disk in the form of text files that contain "name-value pairs" (Brain 2000). The name of the pair acts as a unique identifier (ID) given to the cookie, and it matches the same ID on the server. So when a user returns to the web site where he got the cookie
from, the server recognizes the user, and the data collected by the cookie is "loaded in a database" (McCarthy and Yates 2010, 233) located on the server. Once the cookie has been left on the user's computer, it can perform its main function: tracking data. The cookie is designed to retain a variety of data, such as the IP address, shopping cart items, selected preferences, serial numbers, information left in registration forms, and frequencies of contact with companies (McCarthy and Yates 2010). It can also collect the user's domain, which is important because it can contain information about physical location and username (Chun 2006, 3). By connecting the user's requests, the cookie collects information and stores their behaviour.
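The name-value mechanism described above can be sketched with Python's standard http.cookies module. This is a hedged illustration, not any real tracker's system: the cookie name "uid", its value, and the profiles store are invented for the example.

```python
from http.cookies import SimpleCookie

# Server side: mint a cookie whose value is an opaque identifier.
jar = SimpleCookie()
jar["uid"] = "a1b2c3d4"  # the name-value pair left on the client

# The header the server attaches to its HTTP response:
print("Set-Cookie:", jar["uid"].OutputString())

# Hypothetical server-side database keyed by that same identifier.
profiles: dict[str, list[str]] = {}

def record_visit(cookie_header: str, page: str) -> list[str]:
    """The browser echoes the pair back on later requests; the server
    matches the ID and appends the new observation to the profile."""
    uid = SimpleCookie(cookie_header)["uid"].value
    profiles.setdefault(uid, []).append(page)
    return profiles[uid]

record_visit("uid=a1b2c3d4", "/sports")
print(record_visit("uid=a1b2c3d4", "/finance"))  # ['/sports', '/finance']
```

The server never needs a name or address: the opaque ID alone is enough to link every return visit to one growing profile.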
Cookies possess several essential properties. They are "encoded or encrypted" (Berghel 2002, 24), meaning that even if users are aware of a cookie, they cannot understand its content. This content is determined solely by the server and cannot be changed by the user. Cookies can record "clickstream information" (ibid), which traces the browsing behaviour of the user on a particular web site, but they can also be programmed to be shared by "third-party Web hosts" (ibid), and are then known as third-party cookies. These cookies are attached to a loaded image, for instance a web banner, embedded in a web site. In this case the user not only gets a cookie from the web site itself but can also receive one from another domain embedded in the image, of which the user is not aware. Third-party cookies are used predominantly by advertisers. Their effect is that the user gets the cookie from the advertiser on one visited web site, but if they visit another web site that contains advertising from the same server, the cookie ID matches the server's ID and sends the information to that server. In this way, the server aggregates the information generated by the user on different web sites into large profile databases. Furthermore, the user has no control over the destination of the cookie (cookie leakage) and cannot allow or forbid the cookie to store particular types of information. Moreover, browsers can receive cookies from sites the user has never visited, without their knowledge. Finally, cookies can be classified as "session or persistent" (ibid). Session cookies collect information only while the particular user session lasts and disappear after that, while persistent cookies remain on the user's hard drive with a set expiration date.
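The cross-site matching performed by third-party cookies reduces to the same ID lookup, now run by the ad server across many publisher sites. A minimal sketch follows; all names and sites are hypothetical, and only the bookkeeping is modelled, not the HTTP traffic:

```python
# The ad server keeps one profile per cookie ID. Because the same banner,
# and hence the same cookie, is embedded on many publisher sites, visits
# to different sites end up keyed under a single identifier.
ad_profiles: dict[str, list[tuple[str, str]]] = {}

def serve_banner(cookie_id: str, publisher: str, page: str) -> None:
    """Called when the browser fetches the embedded banner image and sends
    the ad server's own third-party cookie along with the request."""
    ad_profiles.setdefault(cookie_id, []).append((publisher, page))

serve_banner("track-42", "news.example", "/politics")
serve_banner("track-42", "shop.example", "/sneakers")

# One ID now links behaviour across two unrelated web sites.
print(ad_profiles["track-42"])
```

This is what makes the third-party cookie more consequential than the first-party one: each publisher sees only its own visits, but the ad server sees them all.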
To sum up, the properties of the cookies determine that users cannot modify the cook-ie, neither to understand what type of information it collects and where this informa-tion is stored. If the web site does not have an opt-in feature1 for cookies, the users will not be aware that their computers have cookies. In the case of third party cookies, the user can get these bits of data from an unvisited domain.
1 An opt-in mechanism informs the user about the use of cookies before one is sent to the browser and allows the user to accept or refuse the cookie.
Cookie’s black box
In this paper, I will explore the role of several actors on the U.S. market. I would like to argue that they illustrate the logic on which the cookie exists and which dominates the contemporary architecture of the Internet.
DoubleClick
Since cookies were introduced as a tool, advertisers have played a major role in their proliferation. One such company is the online advertising network DoubleClick. The company had been using third-party cookies since the early years after Netscape introduced them. For DoubleClick, cookies enabled the market of behaviourally targeted advertising. The company used third-party cookies to segment its audience and deliver more effective web ads, based on the interests users revealed across the network of web sites that carried DoubleClick ads. By 1998 the company was placing banners on high-traffic web sites such as the Washington Post and the New York Daily News, and it was reported to have collected about 100 million Internet profiles (Jason Williams 2000). As the techno-journalist Catherine Holahan says, "advertisers have a sweet tooth for cookies" (Holahan 2006). Targeted advertising was boosting the company's revenue so much that in 2000 it acquired the direct-marketing company Abacus Direct for $1.7 billion (Jason Williams 2000) and established itself as a market leader. In this sense, we can state that the cookie's mindset is embedded with a market-driven logic, as it serves as an engine for gaining market share and profit. However, the activities of DoubleClick were also closely monitored and influenced by non-governmental and state actors.
Electronic Privacy Information Center
The use of cookies by commercial entities mobilizes non-governmental actors which monitor their activities in terms of privacy protection and alert society in cases of violation. In the U.S., one such organization is the Electronic Privacy Information Center (EPIC), which focuses on monitoring issues regarding civil liberties and privacy protection. It aims to place control over users' information in the hands of the users themselves. In 2000 EPIC examined the work of DoubleClick and was the first to alert the public to the company's violating data-collection practices (Hormozi 2005, 51). EPIC filed a complaint arguing that the company was engaged in "unfair and deceptive trade practices" (EPIC 2000, 1) that were creating a detailed "national marketing database" (ibid, 1) without the knowledge and consent of the users. EPIC argued that although DoubleClick described the "opt-out"2 practice on its own web sites, users received the cookie from other web sites and hence were not familiar with the opt-out procedure. According to the Executive Director of EPIC, Marc Rotenberg, "this complaint against DoubleClick is a critical test of the current state of privacy protection in the United States" (Techno Journal 2000). The complaint prompted another actor, the Federal Trade Commission, to investigate the practices of DoubleClick, and as a result DoubleClick changed its cookie policy, explaining to users how the company collected data and how they could opt out (Hormozi 2005, 57). In this case, EPIC could influence a powerful business model and defend the rights of the users. The case also shows that resistance exists within the black box of the cookie, through EPIC's striving to protect the rights of consumers.
2 A set of practices or methods by which a user can remove a cookie.
However, we need to mention that this success happened in a specific governmental context. In the same year the U.S. government issued the memorandum "Privacy Policies and Data Collection on Federal Web Sites", which prohibited the use of cookies on governmental web sites. In the following years, EPIC continued to monitor commercial and governmental agents and to provoke public debate whenever the use of cookies violated the rights of users. In 2009, EPIC revealed a contract between the U.S. government and Google which allowed Google and YouTube to place third-party cookies on the White House web sites (Hsu and Kang 2009). EPIC argued that this contract showed that the government "failed to protect the privacy rights of U.S. citizens" (ibid). But by this time, the government had already taken a different course towards the use of cookies. In 2009 the federal government issued a Proposal for the Revision of the Policy on Web Tracking Technologies for Federal Web Sites, which revisited the ban on the use of cookies. It allowed federal web sites to use single-session and persistent cookies on the grounds that they would provide "better customer service" (Federal Register 2009, 37062) and "enhanced Web analytics" (ibid). In this case, although EPIC provoked a public debate on the issue of privacy, its actions could not lead to a change in state policy; the resistance in the network was overcome. This shows that EPIC and other non-governmental organizations currently act in a political environment which tolerates and justifies the use of cookies as a legal tool for collecting data.
Federal Trade Commission
The case study of EPIC showed that the U.S. government itself acknowledges the use of cookies on federal web sites. But how was the resistance overcome? At this point, I claim that the policy of the Federal Trade Commission (FTC) towards business shapes the architecture of control over U.S. users. The FTC is the U.S. agency for consumer protection. Its function regarding cookies is to monitor their use by commercial entities. Currently, the FTC does not prescribe special guidelines on the use of cookies. The only strict instruction concerns the Children's Online Privacy Protection Act. According to this act, web sites are forbidden
to collect any information about children under 13 years old, including not only their name or location but also "hobbies, interests" (FTC web site 2006), which a cookie can reveal by tracking the children's browsing activity. In the U.S., business relies on self-regulation regarding the use of cookies. However, the FTC has the right to investigate whether the use of a cookie is disclosed in the privacy policy of a web site; if it is not clearly indicated, the FTC can "pursue action against the companies" (McCarthy and Yates 2010, 234), as it did in the case of DoubleClick. The FTC also monitors whether a web site violates the terms and conditions of its own privacy policy; if it does, the Commission considers this "an unfair and deceptive trade practice" and the web site will be subject to litigation initiated by the Federal Trade Commission. On the question of whether businesses must provide clear notice about the use of cookies, the FTC answers positively only when the cookie "combine[s] the passively collected non-personal information with 'personal information'" (McCarthy and Yates 2010, 234). The Commission's response shows that it provides protection only when the collected data refers to "personal information", known in legal documents as personally identifiable information (PII). U.S. law does not have a single adopted definition of personally identifiable information, but definitions can be found in the memorandum from the Executive Office of the President as well as in the Online Privacy Protection Act (OPPA), which is effective in California state law. According to the memorandum, personally identifiable information is:
"Information which can be used to distinguish or trace an individual's identity, such as their name, social security number, biometric records, etc. alone, or when combined with other personal or identifying information which is linked or linkable to a specific individual, such as date and place of birth, mother's maiden name, etc." (Executive Office of the President 2007)
The OPPA is more precise in its definition, describing PII as information which includes:
"(1) first and last name, (2) a home address, (3) e-mail address, (4) telephone number, (5) social security number, (6) any other identifier that permits the physical or online contacting of a specific individual, (7) information concerning a user that the Web site or online service collects online from the user and maintains in personally identifiable form in combination with an identifier described in this subdivision." (OPPA, Internet Privacy Protection, Business and Professions Code Section 22575-22579)
The collection of personal information on the Internet falls under the strict principles of: (1) notice/awareness, (2) choice/consent, (3) access/participation, (4) integrity/security, and (5) enforcement/redress. For instance, the principle of notice requires
58
a web site to provide customers with a notification about the collection of data “be-fore any personal information is collected from them” (FTC 2000). However, both definitions of personal identifiable information do not include information such as IP addresses or geolocation data which means that it is considered non-personal in-formation. Thereby, the use of the cookies by web sites and ad agents is not required to be notified in advance, except in the cases in which it is combined with collect-ing of personal information. The personal information of individuals is regulated be-cause in the in the U.S. law the term exists as referred to the issue of privacy, which is highly protected by variety of laws on online and offline privacy including: FTC’s Fair Information Practice Principles, Online Privacy Protection Act (OPPA) as well as the Fourth Amendment of the Constitution. Hence, the privacy of individuals is determined as protection only of PII and the tracing technology of the cookie does not violate the privacy of users.
The current policy of the FTC provoked a new actor: congresswoman Jackie Speier, who introduced a change in the course of government policy toward cookies in the House of Representatives in February 2011. The proposal, known as the Do Not Track Me Online Act, includes the protection of individuals' online activities, defined as "(1) the web sites and content, from such web sites accessed; (2) the date and hour of online access, (3) the computer and geolocation from which online information was accessed, (4) the means by which online information was accessed, such as a device, browser, or application" (Speier 2011). The goal of the Act is to direct the FTC in developing "Do not track" standards and regulations. At present, the Do Not Track Me Online Act is a bill: a proposed law that will be monitored and evaluated by the U.S. government and may in the future be enforced as law.
But currently, the FTC allows web sites to transmit the information stored in the cookie directory only under the condition that the web site has disclosed this in its privacy policy. Furthermore, the FTC does not regulate the use of third-party cookies, which can be attached to a web site that requires personal information. In this case, the user decides to disclose personal information based on trust in the web site and agreement with its policies. But a third-party cookie is sent without the knowledge and consent of users and starts collecting the information which they reveal to the web site. The FTC also does not address issues such as users' ability to choose the type of data collected, or the option for users to see what kind of data is stored. The current policy illustrates that the protection of users' data is a matter of language and framing, and an unclear definition mitigates the effectiveness of users' protection.
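The mechanics left unregulated here can be made concrete with a small simulation. The sketch below is purely illustrative (all site and tracker names are hypothetical, and no real ad network's API is modelled): it shows how a cookie scoped to a third-party domain accumulates a cross-site browsing profile, because the browser attaches that cookie to every embedded ad request regardless of which first-party site the user is actually visiting.

```python
# Hypothetical sketch: how a third-party cookie lets an ad network
# correlate a user's visits across unrelated sites. When a page on
# news.example embeds an ad from ads.tracker.example, the browser sends
# the tracker's cookie with the ad request, even though the user never
# visited the tracker directly.
from collections import defaultdict

class AdNetwork:
    """Stands in for a third-party ad server that sets its own cookie."""
    def __init__(self):
        self.next_id = 0
        self.visits = defaultdict(list)  # cookie id -> first-party sites seen

    def serve_ad(self, cookie, first_party_site):
        if cookie is None:                    # first contact: set a cookie
            cookie = f"uid-{self.next_id}"
            self.next_id += 1
        self.visits[cookie].append(first_party_site)  # log the visit
        return cookie

class Browser:
    """Keeps one cookie jar per third-party domain, as real browsers do."""
    def __init__(self, tracker):
        self.tracker = tracker
        self.jar = {}                         # domain -> cookie value

    def visit(self, site):
        # The embedded ad request carries the tracker's cookie, not the site's.
        cookie = self.jar.get("ads.tracker.example")
        self.jar["ads.tracker.example"] = self.tracker.serve_ad(cookie, site)

tracker = AdNetwork()
user = Browser(tracker)
for site in ["news.example", "shop.example", "health.example"]:
    user.visit(site)

# The tracker now holds a cross-site profile under one identifier,
# assembled without the user ever knowingly contacting it.
print(dict(tracker.visits))
# {'uid-0': ['news.example', 'shop.example', 'health.example']}
```

The point of the sketch is the asymmetry the article describes: the user consents (implicitly or explicitly) to each first-party site, but never to the tracker that observes all of them.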
Conclusion
By focusing on several actors in the particular market of the United States, the research revealed that the cookie's design and affordances are embedded with commercial and political principles. Following Mirko Tobias Schäfer's notion of technologies, we can say that the cookie is an object with an inscribed "socio-political mindset" (Schäfer 2011, 12). The cookie is one of the tools that gave the formerly free "home of the mind" a memory over which users cannot impose their control. Opening the cookie's black box reveals a "punctualized network" (Law 1992) of heterogeneous actors and relations. In this network, resistance exists in the sensitive issues of privacy which EPIC raises on the public agenda in order to disrupt the network. But this resistance is predicted, and the government's strategy to overcome it is to create principles of protection in unclear language which justify the use of cookies. According to Wendy Chun, technology has moved the "paranoia" prevailing in the twentieth century from the "pathological to the logical" (Chun 2006, 1). This logic is embedded in the design of the U.S. Internet architecture. A possible change to the current policy lies in the proposal of congresswoman Jackie Speier, but until it is adopted as law, the use of cookies will continue to grow, driven by market and governmental forces. Up until now, the current architecture of the Internet in the U.S. has been shaping a "society of control" (Deleuze 1992, 2) where power over individuals is exerted through invisible mechanisms. The cookie is indeed such an invisible tool, one which does not allow individuals control over their data.
At a time when the everyday life of people takes place in both online and offline space, when geolocating networks prevail and users leave all kinds of data as part of their daily practice, the affordance of the cookie opens the potential for individuals to become what the philosopher Gilles Deleuze called "dividuals" (ibid., 3). In this sense, the U.S. Internet playground creates users made of "samples, data, markets, or 'banks'" (ibid., 3), of personal and "non-personal" information.
References
Akrich, Madeleine. 1994. The De-Scription of Technical Objects. Shaping Technology / Building Society: Studies in Sociotechnical Change (Inside Technology). Cambridge, MA: The MIT Press.
Barlow, John Perry. 1996. A Declaration of the Independence of Cyberspace. https://projects.eff.org/~barlow/Declaration-Final.html.
Baumer, David L., Julia B. Earp and J.C. Poindexter. 2004. Internet Privacy Law: a Comparison Between the United States and the European Union. Computers and Security 23: 400-412.
Bays, Hillary and Miranda Mowbray. 1999. Cookies, Gift-Giving and the Internet. First Monday 4, no. 11 (November) http://firstmonday.org/htbin/cgiwrap/bin/ojs/index.php/fm/article/view/700/610
Berghel, Hal. 2001. Caustic Cookies. Communications of the ACM 44, no. 5: 19-22.
Berghel, Hal. 2002. Hijacking the Web. Communications of the ACM 45 no. 4: 23-27.
Chun, Wendy Hui Kyong. 2006. Control and Freedom: Power and Paranoia in the Age of Fiber Optics. Cambridge, MA: The MIT Press.
Kristol, David M. 2001. HTTP Cookies: Standards, Privacy, and Politics. ACM Transactions on Internet Technology 1, no. 2: 151-198.
Deleuze, Gilles. 1992. Postscript on the Societies of Control. October 59: 3-7. http://www.n5m.org/n5m2/media/texts/deleuze.htm.
e-marketer. 2008. E-marketer web site. http://www.emarketer.com/Article.aspx?R=1006384.
EPIC. 2000. Complaint and Request for Injunction, Request for Investigation and for Other Relief. EPIC web site. http://epic.org/privacy/internet/ftc/DCLK_complaint.pdf.
Federal Register. 2009. Proposed Revision of the Policy on Web Tracking Technologies for Federal Web Sites. Cryptome web site. http://cryptome.org/0001/omb072709.htm.
Federal Trade Commission. 2006. How to Comply with the Children's Online Privacy Protection Rule. FTC web site. http://business.ftc.gov/documents/bus45-how-comply-childrens-online-privacy-protection-rule.
Federal Trade Commission. 2010. FTC web site. http://www.ftc.gov/reports/privacy3/fairinfo.shtm.
Federal Trade Commission. 2000. Fair Information Practice Principles. FTC web site. http://www.ftc.gov/reports/privacy3/fairinfo.shtm.
Galloway, Alexander R. 2004. Protocol: How Control Exists After Decentralization. Cambridge, MA: The MIT Press.
61
Holahan, Catherine. 2006. Taking Aim at Targeted Advertising. BusinessWeek, November 15, 2006. http://www.businessweek.com/technology/content/nov2006/tc20061115_360862.htm.
Hormozi, A. 2005. Cookies and Privacy. Information Systems Security 13, no. 6: 51-59.
Hsu, S. S., and Kang, C. 2009. U.S. Web-Tracking Plan Stirs Privacy Fears. The Washington Post. http://www.washingtonpost.com/wp-dyn/content/article/2009/08/10/AR2009081002743.html.
Latour, Bruno. 2005. Reassembling the Social: An Introduction to Actor-Network-Theory. Oxford: Oxford University Press.
Latour, Bruno. 1994. On Technical Mediation-Philosophy, Sociology, Genealogy. Common Knowledge 3, no. 2: 29-64.
Latour, Bruno. 1993. We Have Never Been Modern. London, UK: Harvard University Press.
Law, John. 1992. Notes on the Theory of the Actor Network: Ordering, Strategy and Heterogeneity. http://www.lancs.ac.uk/fass/sociology/papers/law-notes-on-ant.pdf.
Lessig, Lawrence. 2006. Code: And Other Laws of Cyberspace, Version 2.0. New York: Basic Books.
McCarthy, Laura and Dave Yates. 2010. The Use of Cookies in Federal Agency Web Sites: Privacy and Record Keeping Issues. Government Information Quarterly 27, no. 3: 231-237.
OPPA. 2004. Internet Privacy Requirements: Business and Professions Code Section. Leginfo web site. http://leginfo.ca.gov/cgi-bin/displaycode?section=bpc&group=22001-23000&file=22575-22579.
Raymond, E. S. 1996. Magic Cookie. The New Hacker’s Dictionary. http://www.eps.mcgill.ca/jargon/html/entry/magic-cookie.html.
Schäfer, Mirko Tobias. 2011. Bastard Culture! How User Participation Transforms Cultural Production. Amsterdam: Amsterdam University Press.
Speier, Jackie. 2011. Do Not Track Me Online Act. Jackie Speier Official Web Site. http://speier.house.gov/uploads/Do%20Not%20Track%20Me%20Online%20Act.pdf.
Shah, Rajiv C. and Jay P. Kesan. 2009. Recipes for Cookies: How Institutions Shape Communication Technologies. London: Sage.
Techno Journal. 2000. EPIC Files Complaint with FTC against DoubleClick. Techno Journal. http://www.techlawjournal.com/privacy/20000210.htm.
USA Today. 2010. These Cookies Aren’t Tasty; You are Left hungry of Privacy. USA Today, August 9, 2010 http://usatodayeducate.com/wordpress/wp-content/files/CS_Fall_2010_lesson5.pdf.
Williams, Jason. 2000. Personalization vs. Privacy: The Great Online Cookie Debate. Editor & Publisher 133, no. 9: 26-27.
Williams, Jason. 2000. Will Cookies Crumble in Court? Editor & Publisher 133, no. 6: 26-27.
Galloway opens his book by stating that, “all three [distributed network, computer
and protocol] come together to define a new apparatus of control that has achieved
importance at the start of the new millennium” (3). In this book Galloway explores
how control exists after decentralization. This control, according to Galloway,
manifests itself in the protocol, which can be defined as a management style, and
it “gains its authority from another place, from technology itself and how people
program it […] a type of controlling logic that operates outside institutional, gov-
ernmental, and corporate power, although it has important ties to all three” (121-
122). Galloway furthermore has a non-technical background, which could result in
some interesting insights into protocological control (xxiv).
Galloway's argument about protocological control unfolds in three parts. First Galloway explores the ways in which control exists after decentralization, secondly he explores the failures of protocol, and lastly he ventures into the futures of protocol. Although the book clearly shows that Galloway has done extensive research and goes into great detail, there are some shortcomings.
In the first part Galloway assumes that we are now in an age of decentralization, in light of the architecture of the Internet: "[a] distributed architecture is precisely that which makes protocological/imperial control of the network so easy. In fact, the various Internet protocols mandate that control may only be derived from such a distributed architecture" (25). However, it could be argued that the Internet is decentralized in nature and not distributed (Singel 2006), or, as Rushkoff argued, that it is centrally controlled and not decentralized (Rushkoff 2011). It depends on your point of view and which layer of control you want to address.
Secondly, Galloway explores the failures of protocol: how it is not allowed to come to its full potential (120). It is institutionalization that is contradictory; the bureaucratic nature of institutions (which rely on decision making and rules) goes against the open nature of protocol. This part illustrates the contradictory nature of protocol: "[it] is based on a contradiction between two opposing machines, one machine that radically distributes control into autonomous locales, and another that focuses control into rigidly defined hierarchies" (142). Additionally, there are more contradictions in the protocol concept, visible throughout the book.
Protocol - Alexander Galloway
Reviewed by Ryanne Turenhout
The following citation captures part of the contradictory nature of protocol:

"The contradiction at the heart of protocol is that it has to standardize in order to liberate. It has to be fascistic and unilateral in order to be utopian. It contains, as Jameson wrote of mass culture before it, both the ability to imagine an unalienated social life and a window into the dystopian realities of that life." (95)
Other contradictions are outlined in the first chapter in the discussion of the protocols TCP/IP and DNS, where one is distributed in nature and the other inherits hierarchy. These contradictions in the protocol concept make it all the more difficult to grasp or comprehend, which is something to be avoided when trying to make an argument or explain a concept.
The third part of the book is about the futures of protocol, in which Galloway explores the ways in which resistance against protocol can manifest itself. Resistance always has to come from within protocol itself: "I suggest then that to live in the age of protocol requires political tactics drawn from within the protocological sphere" (151). This is a rather bold statement, which suggests that there is nothing outside protocol that can resist it. Examples that Galloway gives of resistance within protocol are hacking, computer viruses and tactical media, among others.
With the exception of the fourth chapter, Galloway's book places too much emphasis on the technological aspects of the protocol concept. He himself offers this as a disclaimer, noting that much research has already been done on the level of "law, Internet governance, state sovereignty, commercial power or the like" (18). It is argued here that by doing so, Galloway creates a technocratic separation between technology on the one hand and culture, society and politics on the other, whereas culture and technology are intertwined (Schaefer and Rieder 2008, 3-11). Furthermore, as argued by Bruno Latour, responsibility must be shared among the various actants, and therefore cannot simply be ascribed to a single actor (Latour 1999, 180).
Galloway also states that protocol gets its authority from how people program it (121). Here Galloway omits, or does not acknowledge, that lawmakers can also be seen as code writers: "They [lawmakers] determine what the defaults of the Internet will be; whether privacy will be protected; the degree to which anonymity will be allowed; the extent to which access will be guaranteed" (Lessig 2006, 79). This relates to the previous argument made in this review with respect to Bruno Latour's Actor-Network Theory.
Overall, this book and the concept of protocol are an extensive and important addition to the academic field, albeit too technocratic in form and contradictory in nature. The importance lies in the fact that it taps into a niche and relatively unexplored territory. The danger is that it can become too specific or technical and neglect the important factors outlined in this review. However, it is refreshing that Galloway's 'outside' view shows itself in the book. He refrained from falling into too much technical jargon and used the works of Foucault, Deleuze and other philosophers to support his arguments. In that respect it is indeed interesting to have someone with a non-technical background tackle such a technical research topic.
References
Galloway, Alexander R. 2004. Protocol: How Control Exists After Decentralization. Cambridge, MA: The MIT Press.
Lessig, Lawrence. 2006. Code: And Other Laws of Cyberspace, Version 2.0. New York: Basic Books.
Rushkoff, Douglas. 2011. Internet is Easy Prey for Governments. CNN, February 5. http://articles.cnn.com/2011-02-05/opinion/rushkoff.egypt.internet_1_internet-wikileaks-networks.
Schaefer, Mirko Tobias, and Bernhard Rieder. 2008. Beyond Engineering: Software Design as Bridge over the Culture/Technology Dichotomy. Philosophy and Design: From Engineering to Architecture, ed. P. E. Vermaas, P. Kroes, A. Light & S. A. Moore, 152-164. Dordrecht: Springer.
Singel, Ryan. 2006. They Saved the Internet's Soul. Wired, February 8. http://www.wired.com/science/discoveries/news/2006/02/70185.
“You’re dangerous with a phone. Remember what you did when you were alone with a phone in Prague? Remember how many people died?” – Mitchell Royce, newspaper editor. Transmetropolitan, issue 1, Summer of the Year.
Welcome to the wondrous world of Tomorrow. Wait, did I say "wondrous"? I meant horrific. Corrupt politicians manipulate the voters, the media is too obsessed with ratings to promote real journalism, and the sated middle class cares little about anything beyond the next immediate gratification. Wait, did I say "tomorrow"?
It is important for scholars to study fiction. Writers, free of the scrutiny of academic writing and peer review, have a surprising tendency to paint a much better picture of society than professional academics. Literature is also a constant laboratory for generating ideas and concepts which are later introduced to society. Few remember today that the popularity of the "cyber-" prefix, used in academia, business and politics, originated in science fiction writer William Gibson's description of future virtual realms.
Only time will tell whether Ellis and Robertson will supplement the language the way Gibson did. But one thing is sure: today, a decade after their comics series Transmetropolitan was concluded, their prognosis is frightening in its insights. Their creation paints a picture of a society in a truly post-industrial and post-modern state, where information is the main commodity and the everyday citizen is relentlessly bombarded by commercial and political messages. I would like to concentrate on a specific issue prevalent throughout the books: lack of control.
The plot is as follows. Spider Jerusalem, fabled journalist and columnist, returns to the City – a nameless, massive North American metropolis of the future – from a five-year self-imposed exile. He left the city because, as he so depressingly puts it, he "couldn't get at the truth anymore" (issue 1, 19) and therefore could not write anymore. Armed with a laptop, two assistants/friends, and a complete lack of restraint, he tries to paint us a picture of the bizarre futuristic world in which he lives. In later stages, the story becomes rather political (with strong overtones of the first George W. Bush administration), and Spider is forced to flee from a corrupt politician, eventually elected president, and to uncover his true face as a murderer, slaver and sadist.
While the main plot is interesting by itself, it is the background that I wish to focus on. The City, brought to life by a myriad of minor details mentioned in the text and in the richly detailed drawings, is a heaven for consumerism. Basic products are available from nano-molecular "makers", affordable to anyone with a decent income. The basic
Transmetropolitan - Warren Ellis and Darick Robertson
Reviewed by Alex Gekker
TV package includes over 2000 channels. Trained "listeners" prowl the streets of the City armed with the best audio-video equipment, acting as live recording devices for the thousands of cultures that inhabit it. A new religion is invented every hour (issue 6, 7), to gather believers and evade taxes.
In an issue devoted solely to television, Spider writes a column after spending the day exclusively watching television. The talk show host invites the viewers to participate in the debate and mentions off-hand that "calls are free but we will trace and tag your line for advertising purposes" (issue 5, 13). For us this is a shocking statement. Today we still perceive the gathering of personal information by commercial entities as negative. Laws and regulations are created in order to prevent such behavior. One's phone number is a personal thing, and privacy concerns stop us from handing it out freely. But if we observe the trends in social media platforms, where participation is traded for personal data, we can see how the situation described by Ellis and Robertson is simply a logical conclusion. And these are not social platforms that trace your information – these are "serious" political talk shows.
At the end of the issue, the channel suggests standing by for a "block consumer incentive" (issue 5, 20). Spider's assistant screams for him to shut his eyes, but as he has only just returned to the city, he is not aware of the danger. The television flashes and he is left with vague confusion and tiredness. Despite his assistant's pleas, he goes to sleep but wakes up after having several disturbing, highly commercial dreams. The assistant explains to him that he was just exposed to an "add-bomb", a technique for placing products directly into one's subconscious, resulting in commercial ads occurring directly in dreams. Again, the proposition is grotesque to us, but when analyzing current trends, this again is but a reasonable development. Ellis and Robertson warn us of the inability to filter out commercial messages we already experience today: from subtle product placements in TV shows to ubiquitous, location-based ads that follow us around the city.
The authors draw a coherent picture of a future network society. Despite several breaks and already obvious anachronisms (the separation between the print, audio and video aspects of media), the picture is troubling. A neo-Nazi group uses a cheap G-Reader (a device that reads genetic information at a distance) to detect a "problematic" gene in a random teenager on the street, and beats him to death (issue 28, 6-12). Genetic information is nowadays considered "the next big thing" for companies. Opt-outs are becoming the method of choice for consumer services. When your genetic makeup is part of your available profile, the authors warn, opting out may not be possible.
Under the guise of nano-technology, genetic modifications and alien life forms, Transmetropolitan presents the dilemma we face today. The more technology interferes with our lives, guided by a mix of commercial and political motivations, the more powerless we become to prevent or alter the cost we pay in the loss of control over what we share and with whom.
Today, the Internet has become the space where the everyday takes place. As the importance of the Internet in our society grows, it also provokes the analysis of modern philosophers. Scholars such as Jonathan Zittrain and Wendy Chun have raised issues about the problematic future of the Internet and the limited freedom of individuals. In this debate, the new media lawyer Lawrence Lessig expands the discussion from the perspective of regulation. In his book Code: And Other Laws of Cyberspace, Version 2.0, he presents the problems that the U.S. government faces in finding a way to create effective regulation on the Internet. Lessig's background as a co-founder of the copyright license Creative Commons reveals his views about the need for a new type of laws, which is a basic notion in his book. An interesting fact is that the book is a second edition, written on a Wiki page in collaboration between Lessig and scholars and students from Stanford University (x). In the book, Lessig reveals the power of Internet code to regulate. But behind the regulatory mechanisms that he presents, I argue that this book is more about human values and the way they exist and can be protected in the digital space.
In the book Lessig states that depending on its design, the architecture of the Net can promote or put at risk certain values. He states that the code of the Internet will be "the greatest threat to both liberal and libertarian ideals, as well as their greatest promise" (6). For instance, in chapter six, he points out that the architecture of the cyber-community Counsel Connect (94) enabled its members to create valuable debates and relations. Lessig believes that initially the Internet resisted regulation and the values it preserved were values of "freedom" (309). However, in the first part of the book he states that a system of "perfect control" (6), driven by commercial and governmental powers, is reshaping the original infrastructure. In several chapters he warns us that changes in the design put certain values "at risk" (198). He emphasizes the values of intellectual property and addresses the need for new laws to protect the rights of producers. Privacy is the other major value that Lessig argues is threatened on the Internet. In part one, Lessig shows that nowadays commercial forces have created a variety of tracing mechanisms, such as IP traces and cookies. Via these mechanisms users can not only be monitored; the collected data can also be used to boost the business of behavioural advertising. Lessig describes Google's policy of keeping searches and e-mails, the digital
Code version 2.0 - Lawrence Lessig
Reviewed by Kalina Dancheva
surveillance of the U.S. government, and user profiling by advertisers. Jonathan Zittrain supports Lessig in this concern, stating that we are living in an "era of cheap technologies" (Zittrain 2008).
In part three Lessig expresses the concern that currently the U.S. government cannot effectively protect the values of its citizens. The book shows that current laws are based on principles defined in the Bill of Rights, which were relevant in protecting values in the physical world but cannot cope with cases of intervention in the online space. Lessig also raises a potential problem for protecting values due to the international dimensions of the Internet, where conflicting national values meet. He suggests that this "duality" (301) can be overcome by "The Many Laws Rule", which includes identification technologies. Contrary to Lessig, Jack Goldsmith and Tim Wu argue that this duality is not a new phenomenon for lawmakers and that developing international laws can be seen as the "global solution" (Goldsmith and Wu, 164).
It is important to mention that Lessig not only raises questions about the appropriate way to protect these values but, moreover, proposes solutions. For intellectual property rights this can be the Creative Commons licence, which allows authors to define the scope of rights they permit. Lessig argues that the same government that decides to impose control without the consent of the users can also decide to place that control in the hands of the users. Such a solution he sees in the development of an additional identification layer, or what he calls a virtual wallet. This tool would benefit users by providing them with the choice to reveal minimum information when such is required. However, Lessig does not elaborate on the question of how the collected information will be managed in order to ensure the protection of data and users' privacy.
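The minimal-disclosure idea behind the virtual wallet can be sketched in a few lines. This is not Lessig's design, only an illustrative toy (all names and attributes are hypothetical): a wallet holds a full profile privately and releases only the attributes a site actually requests.

```python
# Hypothetical sketch of a "virtual wallet": an identity layer that
# discloses only the attributes a requesting site actually requires.
class VirtualWallet:
    def __init__(self, attributes):
        self._attributes = dict(attributes)   # full profile stays private

    def disclose(self, required_fields):
        # Return only the requested subset; everything else is withheld.
        return {k: v for k, v in self._attributes.items() if k in required_fields}

wallet = VirtualWallet({
    "name": "A. User",
    "age_over_18": True,
    "email": "user@example.org",
})

# A site that only needs age verification learns nothing else.
print(wallet.disclose({"age_over_18"}))  # {'age_over_18': True}
```

The sketch also makes Lessig's open question concrete: the wallet decides what leaves the device, but nothing here governs what the receiving site does with the disclosed data afterwards.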
In the last part of the book, Lessig suggests that the digital space will require governments to make the strategic decision of how to rule. I claim that this decision will illustrate their own values and principles, which will shape the architecture of the Internet. One way is through "closed code", which makes regulation invisible (138). The other way is through "open code", which possesses the values of governing in transparency: control exists, but the users are aware of it (151). Governing through open code is also what Lessig believes is the right way to regulate in the contemporary world.
In conclusion, Code is a book that teaches us about the importance of finding the right balance between governmental and user control. The drawback of the book is that it presents the reality in the United States but does not reveal much about the way values are protected in other markets, such as the European Union. However, Code is a necessary book not only in the academic field of law but also in socio-political and new media studies. It is important because it reveals how the contemporary infrastructure of the Net is being transformed, and it suggests possible solutions as to how essential human values can be effectively protected in the digital space.
References
Goldsmith, Jack and Tim Wu. 2006. Who Controls the Internet? Illusions of a Borderless World. New York, NY: Oxford University Press.
Zittrain, Jonathan. 2008. The Future of the Internet--And How to Stop It. New Haven, CT: Yale University Press.
We encourage you to participate and make your voice heard.
(Mark Zuckerberg on Facebook)
The interplay between user participation and the cultural industries is widely described by Schäfer in his book Bastard Culture! How User Participation Transforms Cultural Production. Participants and media practices are blended together, forming the so-called 'bastard culture'. Facebook is one of the media practices in which technologies empower passive consumers for active participation in Web 2.0. Facebook facilitates easy publishing and sharing of content. Zuckerberg's quotation explains that participation is encouraged within Facebook's practices. These practices do not only refer to user-created content but also involve further development with the use of Application Programming Interfaces (APIs). APIs enable users to connect various applications and sources and use them for different purposes (106). Facebook offers a platform for a developer community that is interested in building applications using an API1. "Facebook […] as platform provider has successfully commodified user activities by implementing them into new business models, which again raises the issue of corporate control and ownership structures" (Zimmer 2008).
The example of Facebook is used by Schäfer (2011) to illustrate the concept of "'user and software governance': it turns companies and users in something more similar to a 'society', where through various processes of interaction both sides try to balance their various interests in a sort of 'agreement'" (171). In the context of Web 2.0, users' activities are integrated into new business models. According to O'Reilly, Web 2.0 is perceived as an 'architecture of participation' (O'Reilly 2005), "a term that clearly points to an understanding of participation generated by design options rather than community spirit" (105).
Schäfer provides a rich analysis of participation and participatory culture, accumulating various theories and interesting case studies set in the context of Web 2.0 as a contemporary cultural practice of computer use. The term participatory culture was initially introduced by Jenkins to distinguish active user participation in online cultural production from an understanding of consumer culture where audiences consume corporate media texts without actively shaping, altering, or distributing them (Jenkins 1991, 2006a, 2006b; Jenkins et al. 2006). Participatory culture is described by Schäfer as a dynamic interaction between users, corporations, discourses, and technologies, and indicates how participation extends the cultural industries in terms of co-creation and the development of media practices. Cultural industries provide platforms for user-generated content, whereas the created content is used through the appropriation of the corporation's design and goals. In this way, participatory culture represents a socio-political understanding of technological advancement in which control is exercised, and participation has become a key concept for framing the emerging media practice, which Schäfer places in a socio-technical ecosystem.
The importance of the socio-technical ecosystem as an environment of participation
rests on “information technology that facilitates and cultivates the performance of
a great number of users” (18). To provide an analytical framework, Schäfer draws on
Latour’s actor-network theory to capture the complexity of design and user activities
as distinct but intertwined constituents of participatory culture. Unlike many other
researchers in the field of participatory culture, Schäfer does not romanticize user
participation:
I argued extensively against the rosy picture of user participation, not only
because it describes the phenomenon of participation insufficiently, but also
because it’s illusionary rhetoric neglects the problems at hand and serves
‘a self-incurred immaturity’. Providing an analysis of the actor networks in-
volved in shaping our cultural reality through patent laws, regulations, and
technological design can contribute significantly to making socio-political
dynamics public and comprehensible to a broader audience.
(Schäfer 2011, 173).
The unfolding of online cultural production by users has often been argued for
enthusiastically, but Schäfer steps beyond this ‘romanticization’ of user
participation and analyzes it in the context of the accompanying popular and
scholarly discourse and the practices of design and appropriation. By revealing the
actors within participatory culture, Schäfer provides an analysis that rests on
neither utopian nor culturally pessimistic assumptions. His aim is to bring awareness
to the ‘flip-side’ of user participation as a complex and dynamic process of power
relations.
The ‘flip-side’ of user participation could have been elaborated further with regard
to media education, which Jenkins et al. (2006) address through three concerns about
participatory culture: the ‘participation gap’, the ‘transparency problem’, and the
‘ethics challenge’. The first concern involves inequalities in access to new media
technologies and in the opportunities for participation. The second concern describes
the consequences of increased access to information, which leads to an increasing
difficulty of interpretation; in other words, the facilities and tools that technology
provides are not necessarily easy to use or interpret. The third concern deals with
the complex and diverse social online environment, and refers to the “breakdown of
traditional forms of professional training and socialization that might prepare young
people for their increasingly public roles as media makers and community
participants” (Jenkins et al. 2006, 5).
Jenkins et al. (2006) do not agree that the tools and facilities for participation
are acquired simply by interacting with popular culture; they argue instead for
policy and pedagogical interventions. This contrasts with Schäfer’s argument that
easy-to-use interfaces make participation possible, and that users do acquire the
facilities and tools for participation.
Nonetheless, Schäfer’s multidisciplinary theory offers an analytical approach that
reveals socio-political factors and insists on a more critical scholarly perspective
on participatory culture.
References
Jenkins, Henry. 1991. Textual Poachers: Television Fans And Participatory Culture.
New York: Routledge.
Jenkins, Henry. 2002. Interactive Audiences? The Collective Intelligence Of Media
Fans. In The New Media Book. Dan Harries (Ed.). London: BFI.
http://web.mit.edu/21fms/www/faculty/henry3/collective%20intelligence.html
Jenkins, Henry. 2006a. Fans, Bloggers, And Gamers: Exploring Participatory Cul-
ture. New York: NYU Press.
Jenkins, Henry. 2006b. Convergence Culture. Where Old And New Media Collide.
New York: NYU Press.
Jenkins, Henry et al. 2006. Confronting The Challenges Of Participatory Culture:
Media Education For The 21st Century. MacArthur Foundation.
http://www.digitallearning.macfound.org/atf/cf/%7B7E45C7E0-A3E0-4B89-AC9C-E807E1B0AE4E%7D/JENKINS_WHITE_PAPER.PDF
O’Reilly, Tim. 2005. What Is Web 2.0: Design Patterns and Business Models for the
Next Generation of Software. O’Reilly Net.
http://oreilly.com/web2/archive/what-is-web-20.html
Schäfer, Mirko T. 2011. Bastard Culture! How User Participation Transforms
Cultural Production. Amsterdam: Amsterdam University Press.
Zimmer, Michael. 2008. The Externalities Of Search 2.0: The Emerging Privacy Threats
When The Drive For The Perfect Search Engine Meets Web 2.0. First Monday 13, no. 3.
http://www.uic.edu/htbin/cgiwrap/bin/ojs/index.php/fm/article/view/2136/1944
Free style - The Listeners
Reviewed by Bob van de Velde
Having come from off the grid, Zarathustra returned saddened by the sight of his
brethren. Then he drew a breath and spoke to them:
When I left the grid, it was small, and you were free. Now you spend your time in the
playgrounds of others. These hosts who introduce you to each other, and who tell
you fun is free.
Yet you use their slides and swings, which speed things up, and bring things down.
Like children here you find yourselves in a playground. But up to down, front to back,
an insight into the way these things exert control you lack1
Remember the freedom whence you came. Now the gates behind you are
closed. Once creators you were, now your only lust is to consume, to undergo. With
song and laughter, you have committed yourself to your hosts’ frame2
Amongst each other are you on this playground. Save for the host, no one here sets
the bounds. And like a host, he chooses why and whom to evict. Here you have no say
in the rules, save but to quit. Whatever the hosts serve, you will eat. And so without
knowing, you surrender the tracks of your feet3
Amongst you are those who attempt to resist. Who break their cage and shake their
fists. But masked like some Guy Fawkes4, they do not remember that all it did was
provoke government, that fifth of November.5 And a master for a master is a bad trade.
Your resistance is futile if you remain submitted to such saviors. It is a
self-fulfilling fate. Enthralled by newfound transparency6, without the individual
refusing to play, no change comes.
But even your host leaves you virtually7 uninhibited. So your incessant talk, and
1 See R. L. Turenhout’s contribution to this journal
2 See L. Szilvási’s contribution to this journal
3 See K. Dancheva’s contribution to this journal
4 http://mouemagazine.wordpress.com/2008/07/21/alan-moore-mentions-anonymous-protests/
5 See A. Gekker’s contribution to this journal
6 Jodi Dean 2003, page 110: “All sorts of horrible political processes are perfectly transparent today. The problem is that people don’t seem to mind, that they are so enthralled by transparency that they have lost the will to fight (Look! The chemical corporation really is trying. . . Look! The government explained where the money went. . .).”
7 Note the double meaning
waltzes from one piece of playground to the other. In the plays you play, your mind
is exhibited. Offered are many choices, many voices to be heard, and you can repro-
duce all kinds of words.8 The control lies in yourself9, but if as children you behave,
CAN you control yourself?
For through choices we create. And all which has been created is history10. And
although our paths change, our trails do not, which we see when reminded of
something we forgot. Whether we try to go past infamy or fame, what has been can
be as hard to change, as a name11
Our hosts have long ears, and penetrating gazes. From the steps we take, they rec-
ognize our faces12. A thousand mechanical homunculi tick in unison to serve our
hosts13, acting as keepers of our secrets. You act as though you lack control, but
it is not control of yourself you want. You shudder at the thought of those who have
listened, but never keep your mouths shut. And you see only the listener to blame, as
if you yourself did not participate in the host’s game.
And your hosts too speak in split tongues. The host promises not to kiss and tell14, yet
with the same breath lets all intentions fall15. Yet we care not for his deceit, for
the games you play there, seem perfectly free. And for protection from solitude and
ennui, you dance your dances, and forget control so as to be free.
Here the youth intervened, for he said: What beef you have with simple games? We
youth need play, and for new contacts we aim! Thou shallow representation of our
constitution, only serves to invite your retribution. Thou sees not how unlimited our
appetites are, and how our dances and games carry far. This is but a fact of a nature
of our souls, not something we should be made to control!
8 “specific articulations reproduce, … by fixing meaning in particular ways” (Jorgensen, Marianne and Louise Phillips. 2002: 29)
9 Deleuze, Gilles. 1992. Postscript on the Societies of Control. October 59: 3-7.
10 “On a web site, the site can track not only your purchases, but also the pages that you read, the ads that you click on, etc.” How Internet Cookies Work by Marshall Brain, 2001
11 Eric Schmidt: http://www.telegraph.co.uk/technology/google/7951269/Young-will-have-to-change-names-to-escape-cyber-past-warns-Googles-Eric-Schmidt.html
12 “For others, mysterious corporate ‘cookies,’ allegedly capable of following our every move, or voracious ‘packet sniffers’ epitomized the risk of going online.” Wendy Hui Kyong Chun 2006
13 The servers used to store information about users: http://www.pandia.com/sew/481-gartner.html
14 http://arstechnica.com/tech-policy/news/2010/05/facebooks-zuckeberg-admits-mistakes-promises-privacy-fixes.ars
15 http://www.businessinsider.com/well-these-new-zuckerberg-ims-wont-help-facebooks-privacy-problems-2010-5
Here Zarathustra raised his voice: A fool are you to reason so! Too weak are you for
self-control, thus you scrutinize the host, a long-eared host who runs the show. You
seek only to bind him to your power, by setting limits on what should be remembered,
in order to construct a Babel tower, where your hosts no longer understand your
speech!
Is it not wrong to limit another to be strong? To bind the memory of another, so you
need not keep your tongue? I say to you, you are a thing to be surpassed. Your
illusion of innocence cannot last. You shelter yourself in shadows to avoid blame,
and continue to participate for social fame! Cast off your masks and threads of
illusory civilization, as now they are stripped from you, resulting from willful
participation!
- Thus spoke Zarathustra,
- So tweeted the youth