Privacy in Social Networks – Economic Options for Regulation



Inaugural Dissertation

for the Attainment of the Doctoral Degree

of the Faculty of Economics and Behavioural Sciences

at the Albert-Ludwigs-Universität Freiburg i. Br.

Submitted by

Claus-Georg Nolte

Born in Siegburg

Winter Semester (WS) 2017/18

Privacy in Social Networks –

Economic Options for Regulation

Albert-Ludwigs-Universität Freiburg

Wirtschafts- und Verhaltenswissenschaftliche Fakultät

Albert-Ludwigs-Universität Freiburg im Breisgau

Wirtschafts- und Verhaltenswissenschaftliche Fakultät

Kollegiengebäude II

Platz der Alten Synagoge

Dean: Prof. Dr. Alexander Renkl

First Reviewer: Prof. Dr. Dr. h.c. Günter Müller

Second Reviewer: Prof. Dr. Dieter K. Tscheulin

Date of Submission: 19.12.2017

Date of Doctoral Examination: 21.02.2018

Acknowledgements

In the course of developing this doctoral thesis and during my doctorate, I have enjoyed the

continuous support of a great number of people, to whom I would like to express my sincere gratitude.

First and foremost, I want to express my deep gratitude to my doctoral supervisor Prof. Dr. Dr.

h.c. Günter Müller for his mentorship and support over the past couple of years. I am very

thankful for his friendly attitude towards me, which led to many interesting and insightful

discussions even beyond the scientific scope. His advice and recommendations were extremely

valuable to my professional as well as personal development and of great help in writing this

dissertation. Overall, I always felt trusted and was granted the academic freedom to pursue

the objectives I was interested in. I would further like to express my gratitude towards Prof. Dr.

Dieter K. Tscheulin for accepting the role of second reviewer of this dissertation and of

examiner in my Rigorosum. Also, I would like to thank Prof. Dr. Günter Knieps for his role as

examiner in the Rigorosum. Additionally, I want to thank the people at the Friedrich Ebert

Foundation for offering me a doctoral scholarship and for their general support of this thesis.

I would also like to thank my colleagues and friends at the Institute of Computer Science and

Social Studies in Freiburg. In particular, I want to express my sincere appreciation to Dr.

Christian Zimmermann for his support with my first scientific contributions, his continuous

help in manifold topics and the many memorable experiences we had. Furthermore, my thanks

go to Dr. Christian Brenig for the fruitful discussions on research related topics and beyond.

Moreover, I want to thank my fellow colleagues Adrian, Arnt, Julius, Markus, Nadine, Richard

and Thomas for their efforts to support me and the friendly atmosphere at the Institute. I am

very happy that I have been able to work with such great colleagues and friends. I further thank

Anja Jülch for her help with administrative issues as well as Jonas Schwarz and Jonas

Kaltenbacher for their professional assistance.

Last but not least, I want to express my deepest gratitude towards my family. All this would not

have been possible without the unconditional support of my parents throughout my whole life.

I would like to especially thank them for their confidence in me, all the possibilities they offered

me and their encouragement in writing this thesis. I am also particularly thankful to my lovely

girlfriend Birthe, whose unreserved backing, patience and professional advice helped me to

write and especially to finalize this dissertation.

Cover design: Malte Rosenberg

Cover image: Gerd Altmann


Table of Contents

Table of Contents .................................................................................................. i

List of Figures ..................................................................................................... iv

List of Tables ....................................................................................................... vi

List of Abbreviations ....................................................................................... viii

1 Privacy in the Age of Social Network Services ............................................. 1

1.1 Privacy Factors in SNSs and Their Environment ...................................... 3

1.1.1 Personal Data ................................................................................. 3

1.1.2 Social Network Service.................................................................. 4

1.1.3 Social Network Operator ............................................................... 6

1.2 A Valid Privacy Definition for SNSs ........................................................ 7

1.3 Research Questions and Objectives ........................................................... 9

1.4 Outline and Contributions ........................................................................ 13

1.5 Remarks on Publications ......................................................................... 17

2 Interrelations of Privacy in the SNS Platform Environment ................... 19

2.1 A Framework for SNSs as Platform Businesses ...................................... 20

2.1.1 Approach ...................................................................................... 21

2.1.2 Platform Participants .................................................................... 22

2.1.3 Platform Architecture................................................................... 23

2.1.4 Platform Governance ................................................................... 23

2.1.5 Platform Effects ........................................................................... 24

2.2 System Dynamics Modelling of the SNS Environment .......................... 26

2.2.1 The Core....................................................................................... 26

2.2.2 The Periphery ............................................................................... 28

2.3 Insights from the SNS SD Model ............................................................ 31

2.3.1 Core Model Results...................................................................... 31

2.3.2 Extended Model Results .............................................................. 31

2.3.3 Limitations ................................................................................... 33

2.4 Conclusion ............................................................................................... 33

3 Economic Modelling of Privacy in SNSs ..................................................... 35

3.1 Case Study: PD as a Payment Method in SNSs ...................................... 36

3.1.1 Related Research .......................................................................... 36

3.1.2 Hypothesis and Procedure ............................................................ 37

3.1.3 Survey Results ............................................................................. 40

3.1.4 Regression Analysis ..................................................................... 40


3.1.5 Regression Results ...................................................................... 42

3.1.6 Overall Results ............................................................................ 43

3.2 Modelling the Principal-Agent Dilemma in SNS ................................... 45

3.2.1 Related Work ............................................................................... 46

3.2.2 Model Assumptions ..................................................................... 46

3.2.3 Designing the Principal-Agent Model ......................................... 48

3.2.4 Balancing User Privacy and SNO Profit-Seeking ....................... 51

3.3 Conclusion ............................................................................................... 58

4 SNS Privacy and Regulation ....................................................................... 59

4.1 Economic Requirements for SNS Privacy Regulation ............................ 60

4.1.1 Related Literature ........................................................................ 61

4.1.2 Identified Privacy Threats and Data Types ................................. 63

4.1.3 Developing the Multidimensional Privacy Framework .............. 65

4.1.4 Framework Range of Coverage ................................................... 72

4.1.5 Framework Discussion ................................................................ 75

4.1.6 Potential Applications and Implications ...................................... 76

4.2 The EU Approach to Privacy .................................................................. 78

4.2.1 Related Literature ........................................................................ 79

4.2.2 Recapitulation: The Multidimensional Privacy Framework ....... 80

4.2.3 The GDPR: Genesis, Structure and Aim ..................................... 81

4.2.4 Interim Result: Conditions and Classifications met by the GDPR ............ 92

4.2.5 Evaluation: Does the GDPR cover it all? .................................... 93

4.2.6 Legal and Economic Capability .................................................. 99

4.2.7 Technical Feasibility ................................................................. 105

4.3 Conclusion ............................................................................................. 107

5 Social Network Services: Competition and Privacy ................................ 109

5.1 SNS Market Competition and its Influence on Privacy ........................ 110

5.1.1 Economic Analysis: Requirements and Model ......................... 111

5.1.2 Reality Check and Market Development .................................. 118

5.2 Enhancing Privacy Competition ........................................................... 123

5.2.1 The Right to Be Forgotten ......................................................... 123

5.2.2 The Right to Data Portability .................................................... 125

5.2.3 The Concept of Interoperability ................................................ 126

5.2.4 The Concept of Privacy Trust Banks ........................................ 130

5.2.5 Further Ideas .............................................................................. 133

5.3 Conclusion ............................................................................................. 136


6 Does Economic Competition and Regulation Provide a Solution for Privacy in SNSs? ...... 139

6.1 Summary and the Main Results ............................................................. 139

6.2 Implications for Future Research ........................................................... 143

Appendix ............................................................................................................... i

1. References ......................................................................................................... i


List of Figures

Figure 1. Schematic Representation of Transactions in DCB (Source: [42]). ........................... 4

Figure 2. MSP model, including platform participants (I&II), same-side network effects (1&4),

and cross-side network effects (2&3; according to [40]). .......................................................... 5

Figure 3. Triangular Business Interaction. ................................................................................. 7

Figure 4. Outline of the Dissertation. ....................................................................................... 14

Figure 5. SNS core and periphery illustration loosely based on [44]. ..................................... 22

Figure 6. SD SNS Core Model................................................................................................. 28

Figure 7. SD SNS Core & Periphery Model. ........................................................................... 30

Figure 8. Survey Composition. ................................................................................................ 39

Figure 9. Instruments for Addressing the Privacy Problem in SNS. ....................... 55

Figure 10. Privacy Threat Cause-Effect Relationships. ........................................................... 64

Figure 11. The Privacy Dimensions. ........................................................................................ 65

Figure 12. Conceptualization of Accountability as a Privacy Principle (taken from [196]). ... 67

Figure 13. Privacy Threats Embracing the Framework. .......................................................... 75

Figure 14. GDPR's Influence on the SNS Market Structure. ................................................. 108

Figure 15. The Market Structure of SNS. ............................................................................... 114

Figure 16. SNO Perspective on Competition. ......................................................................... 117

Figure 17. Social Media Ads used regularly by Marketers According to [253]. ................... 121

Figure 18. Interoperability: SNS Market Role and Impacts. ................................................. 129


Figure 19. Privacy Trust Bank: Market Role and Impacts. .................................................... 133

Figure 20. Histogram of Age. ..................................................................................................... x

Figure 21. Histogram of FB Friends. ......................................................................................... xi

Figure 22. Histogram of Privacy Awareness. ............................................................................ xi

Figure 23. Distribution of Age through Experiment Groups. ................................................... xii

Figure 24. Distribution of FB Friends through Experiment Groups. ....................................... xii

Figure 25. Distribution of used FB Functions through Experiment Groups. .......................... xiii

Figure 26. Distribution of Users' Privacy Awareness through Experiment Groups. ............... xiii

Figure 27. Distribution of Degree of FB Using through Experiment Groups. ........................ xiv

Figure 28. Distribution of chosen Functions through Experiment Groups. ............................ xiv

Figure 29. Bivariate Regression for func.diffi with exp.gri as explanatory Variable. .............. xvi


List of Tables

Table 1. Direct and Indirect Related Publications. ................................................................... 17

Table 2. Literature Review Distribution. ................................................................................. 21

Table 3. Chosen Functions for Different Experimental Groups .............................................. 40

Table 4. Multiple Regression for Selected Functions in the Experiment. ................................ 41

Table 5. Multiple Regression for Used FB Functions .............................................................. 42

Table 6. Applicability of high-level approaches in different scenarios .................... 53

Table 7. Identified SNS Privacy Threats .................................................................................. 63

Table 8. Horizontal Privacy Conditions and Classifications. ................................................... 67

Table 9. The Multidimensional Privacy Framework. ............................................................... 71

Table 10. Privacy Conditions and Classifications met by the GDPR. ..................................... 93

Table 11. Privacy Types covered by the GDPR. ...................................................................... 96

Table 12. Privacy Sectors covered by the GDPR, distinguished to four different Scores. ...... 98

Table 13. Feature of Goods Classification for SNS. ............................................................... 113

Table 14. SNO Activities and Their Influence on User Privacy. ............................................ 118

Table 15. Breakdown of the Literature Review. ......................................................................... i

Table 16. Survey Design. ........................................................................................................... ii

Table 17. Distribution FB Users ................................................................................................ ii

Table 18. Gender Distribution. ................................................................................................... ii

Table 19. Educational Achievement Distribution. .................................................................... iii


Table 20. Variable Overview ..................................................................................................... iii

Table 21. Survey Raw Data. ..................................................................................................... vii

Table 22. R Script Data. ............................................................................................................. ix

Table 23. Mean, Standard Error and Correlations. ..................................................................... x

Table 24. Levene’s Test for Homogeneity of Variance (center = median) for the variable age

over exp.gr. ............................................................................................................................... xv

Table 25. Levene’s Test for Homogeneity of Variance (center = median) for the variable pr.awa

over exp.gr. ............................................................................................................................... xv

Table 26. Levene’s Test for Homogeneity of Variance (center = median) for the variable

fb.friends over exp.gr. ............................................................................................................... xv

Table 27. Levene’s Test for Homogeneity of Variance (center = median) for the variable fb.use

over exp.gr. ............................................................................................................................... xv

Table 28. Levene’s Test for Homogeneity of Variance (center = median) for the variable

func.diff over exp.gr. ................................................................................................................. xv

Table 29. Levene’s Test for Homogeneity of Variance (center = median) for the variable educ

over exp.gr. ............................................................................................................................... xv

Table 30. Bivariate Regression for func.diffi with exp.gri as explanatory Variable. ................ xvi

Table 31. Bivariate Regression for func.diffi with pr.awai as explanatory Variable. .............. xvii

Table 32. Bivariate Regression for func.diffi with gen.malei as explanatory Variable. .......... xvii

Table 33. Bivariate Regression for func.diffi with educi as explanatory Variable. ................. xvii

Table 34. Bivariate Regression for gr.func.alli with educi as explanatory Variable. ............. xviii

Table 35. Multiple Linear Regression for func.diffi. .............................................................. xviii


List of Abbreviations

API Application Programming Interface

Art. Article of the European Union General Data Protection Regulation

BD Big Data

DCB Data-Centric Business

DCS Data-Centric Services

DPIA Data Protection Impact Assessment

DPO Data Protection Officer

EU European Union

FB Facebook

GDPR General Data Protection Regulation

H Hypothesis

HTTPS Hypertext Transfer Protocol over Transport Layer Security

IA Information Asymmetry

IETF Internet Engineering Task Force

IS Information Systems

IT Information Technology

MH Moral Hazard

MHG Multihoming

MND Minimal-Necessary Data


MSP Multi-Sided Platform

NGO Non-Governmental Organisation

NPO Non-Profit Organisation

OSN Online Social Network

PBD Privacy by Design

PD Personal Data

PGP Pretty Good Privacy

PIA Privacy Impact Assessment

PM Payment Method

PSB Privacy-Seeking Behaviour

PTB Privacy Trust Bank

Rec. Recital of the European Union General Data Protection Regulation

RS Reputation Services

RQ Research Question

SD System Dynamics

SM Social Media

SNO Social Network Operator

SNS Social Network Service

TSM Two-Sided Market

TSP Two-Sided Platform


UDD Unintended Data Disclosure

UGC User Generated Content

USA United States of America

W3C World Wide Web Consortium

XMPP Extensible Messaging and Presence Protocol


1 Privacy in the Age of Social Network Services

When Samuel D. Warren and Louis Brandeis wrote “The Right to Privacy” in 1890 and

provided the first concept of privacy as “the right to be left alone” [1], the world was almost a

century away from developing the internet. Later, in 1967, Westin presented a definition that is

still relevant when he explained privacy as “the claim of individuals, groups or institutions to

determine for themselves when, how, and to what extent information about them is

communicated to others" [2]. This was still approximately 30 years before people began finding,

friending, and sharing data with each other on social network services (SNSs) such as Facebook

(FB) on a large scale.1 In 2003, Westin added that privacy “also, involves when such information

will be obtained and what uses will be made of it by others” [5]. Whenever the word “privacy”

is used in this dissertation, it refers to this composite definition of Alan F. Westin.

Since the first formulation of the privacy concept, much research has been conducted on the

topic. The academic literature platform Google Scholar provides over 4,580,000 articles and

books concerning the field of privacy.2 Furthermore, works of fiction such as Orwell's

"Nineteen Eighty-Four" have addressed the subject [6]. When Edward Snowden leaked information

about the US global surveillance program in 2013, Orwell's novel regained popularity and

research on internet privacy gained fresh relevance and attention [7]. With the

"Snowden leaks", the presentiment of some scholars that the massive collection of personal

data (PD) and its analysis for behaviour patterns could threaten individual privacy was

confirmed [8]. Consequently, researchers' focus shifted from an ethical view, including the

conceptualization of privacy and its relationship with other constructs, to threats associated with

the massive collection and analysis of data from various sources (big data (BD)), and the

related security of information systems (IS) [9, 10]. At the Davos Forum in 2015, Margo Seltzer

summarized the ongoing debate about a contemporary conception of privacy, stating that "privacy as

we knew it in the past is no longer feasible” [11].

1 Depending on the definition, the introduction of a graphical user interface for the bulletin board system in the

early 1980s can be interpreted as a first and simple SNS [3]. However, the worldwide public rise of SNSs as

they are known today began at the earliest with sixdegrees.com in 1997 [4].

2 https://goo.gl/VNbUvX (last reviewed on 23.08.2017).


Recent research on internet privacy highlights the discussion of whether the UK Brexit vote

and the 2016 US presidential election were manipulated via FB, with critics claiming that companies

used PD disclosed by FB users for behavioural profiling and then directly addressed the

needs and fears of those voters via SNSs with personalized advertisements [12]. Indeed,

recent research has shown that the voting behaviour of SNS users can in fact be influenced by

FB [13]. A similar observation was made by data scientist Cathy O’Neil, who indicated that

“predatory ads” on SNSs use the same mechanism to sell overpriced loans or for-profit

university places to the most desperate individuals [14].

There are four motives for conducting research on privacy in SNSs. The first motive is legal,

because privacy is recognized as a fundamental human right [15]. Thus, all spheres

where privacy may be threatened must be considered important. The second motive is economic

because if the privacy needs of citizens are not met, their willingness to participate in SNSs and

do business online may decline, with lasting negative consequences for the overall economy [16, 17].

In addition, some authors indicate that the increasing information asymmetry (IA) between SNS

users and companies using disclosed PD for BD may increase the unequal distribution of global

income and wealth and contribute to political tension [14, 18, 19]. The third motive stems from

behavioural science, because individuals have a need for privacy to explore

different identities and develop a sense of autonomy [20]. The fourth motive is political because

the algorithms of SNSs, which present users only their preferred content (the filter bubble effect), in

combination with the aforementioned voting manipulation of FB users may lead to a

separation of political groups [21, 22]. This separation leaves less room for political

compromises, which are necessary for a well-functioning democracy. All in all, with more than

50% of internet users as active members of the largest SNS, FB, and with nearly half of the

worldwide population connected to the internet, SNS user privacy is relevant to at least one

fourth of humanity [23, 24].

This dissertation analyses the topic of privacy in SNSs from an IS and economic research

viewpoint. The aim is to illustrate privacy factors in the SNS environment and examine the

related dynamics of user privacy. As such, this thesis analyses whether the status quo of privacy

in SNSs is economically inefficient or leads to inefficiency, and whether governmental

regulation is required. Moreover, existing approaches to solving the privacy challenge in SNS

business are assessed. Before the research questions (RQs) are concretised at the end of this

chapter, some preliminary remarks are made and the essential concepts are explained in the


following sections. Concluding this chapter, publications that are directly and indirectly related

to this dissertation are listed for completeness.

1.1 Privacy Factors in SNSs and Their Environment

One translation of the word privacy is secrecy, which makes surveillance the opposite of privacy

[25]. A distinction is drawn between state surveillance and commercial surveillance. State

surveillance is defined as the monitoring of citizens and other people by government institutions

for political and security reasons. In contrast, commercial surveillance describes the observation

of users and the analysis of their PD by businesses for commercial purposes [26]. This

dissertation focuses solely on commercial surveillance in SNSs and the connected problem of

PD disclosure and related privacy threats. To build the framework for the following chapters,

the fundamental terms are defined below and are concretised when necessary in the relevant

places.3

1.1.1 Personal Data

The current most advanced data protection legislation, the General Data Protection Regulation

of the European Union (GDPR), clarifies PD as follows. In Art.4.1, PD is defined as “any

information relating to an identified or identifiable natural person”. An identifiable natural

person is one who can be "identified, directly or indirectly, in particular by reference to an

identifier such as a name, an identification number, location data, an online identifier or to one

or more factors specific to the physical, physiological, genetic, mental, economic, cultural or

social identity of that natural person” [34]. This definition declares every bit of data disclosed

via or about a natural person as PD, if it allows drawing conclusions about her and is not

anonymized in a non-recoverable way.
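The last clause, anonymization in a "non-recoverable way", can be illustrated with a minimal sketch (all names are hypothetical and the code is illustrative, not drawn from the GDPR's text): merely replacing a direct identifier with a hash is pseudonymization rather than anonymization, because anyone holding a list of candidate names can re-identify the person by re-hashing, so the data remains PD under Art.4.1.

```python
import hashlib

def pseudonymize(name: str) -> str:
    # Replace a direct identifier with a stable hash token.
    return hashlib.sha256(name.encode("utf-8")).hexdigest()

# Hypothetical candidate list an attacker might hold.
known_users = ["Alice Example", "Bob Example"]

token = pseudonymize("Alice Example")

# Re-identification by re-hashing every candidate: the mapping from
# name to token is recoverable, so the token is still personal data.
reidentified = [u for u in known_users if pseudonymize(u) == token]
```

Because the hash is deterministic, `reidentified` recovers the original person; genuinely non-recoverable anonymization would have to destroy this link, for example through aggregation or the addition of noise.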

In the information society, higher capacities to analyse data and superior data sets can constitute

crucial competitive advantages for SNSs and other companies [35]. Among the different classes

of data, the World Economic Forum stated that PD is a “critical source of innovation and value”

[36]. The common metaphor of PD as “the new oil of the 21st century” by European

Commissioner Kuneva further illustrates the critical role that is attributed to PD in the economy

3 This subchapter contains a compilation of similar preliminary remarks from [27–33] in modified form.


[36]. Consequently, PD markets have emerged, where data subjects such as SNS users

participate as suppliers of PD, often without knowing who collects, transfers, and monetizes

the data related to them [37]. Moreover, economic value is not solely created by the PD that is

gathered but also by the BD analysis of it and the revealed inferences and patterns [38]. The

combination an individual’s PD with external data from various sources often reveals further

insights into that individual’s behaviour, preferences, and state of health. The conclusions that

are drawn can be used for manifold purposes with wide ranging implications for the individual

and society, including target advertisements in SNSs [39], disease spread predictions [40], and

political manipulation [12].

1.1.2 Social Network Service

The current major SNSs are complex ecosystems which include different participants. While

there are SNS users interacting with each other and disclosing PD, the current major networks

do not charge their users for this service in monetary terms [41]. However, the leading SNSs

such as FB and YouTube are for-profit businesses that rely on a steady income to finance their

running costs (e.g. servers and programmers). The necessary income is predominantly created

by aggregating and analysing the SNS users’ PD. The identified inferences are then used to

provide precisely targeted advertising space to business customers, representing the other side

of the SNS ecosystem [41]. This revenue model makes SNSs data-centric businesses (DCBs),

as defined by Müller et al. [42]. A simplified scheme of DCBs is depicted in Figure 1.

Figure 1. Schematic Representation of Transactions in DCB (Source: [42]).


Data-centric service (DCS) providers such as FB provide most services free of charge (e.g.

social networking) to users and generate revenue by allowing companies to present targeted

advertisements to these users. Thus, DCBs act as multi-sided platforms (MSPs) that cater to

users of their services and to advertisers [43]. These MSPs, also known as two-sided platforms

(TSPs) or markets, exhibit three main features [44]. According to Staykova and Damsgaard,

MSPs:

(1) enable direct interaction between two or more participants affiliated with them,

(2) contain homing and switching costs for these participants, and

(3) include direct (same-side) and indirect (cross-side) network effects (see Figure 2).

Figure 2. MSP model, including platform participants (I&II), same-side network effects (1&4), and cross-side

network effects (2&3; according to [40]).

Homing costs are the costs, in money, effort, time and other resources, of entering

and using an MSP. Switching costs are similar and are incurred by participants switching from

one platform to another. Furthermore, network effects occur when the value of the platform or

its product for one user depends on the number of other users (of the same or another group).

These effects can be positive or negative and are key aspects of MSP [45].
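Since network effects are central to the MSP analysis, a minimal numeric sketch may help fix the idea. The function and weights below are purely illustrative and not part of the cited literature; they merely encode that a participant's value from the platform grows with both same-side and cross-side user numbers.

```python
# Toy illustration (hypothetical linear weights, not from the MSP literature):
# the value of a two-sided platform for one participant as a function of
# same-side users (e.g. friends on an SNS) and cross-side users
# (e.g. the audience reachable by an advertiser).
def platform_value(same_side_users: int, cross_side_users: int,
                   same_side_weight: float = 0.5,
                   cross_side_weight: float = 0.2) -> float:
    """Positive direct (same-side) and indirect (cross-side) network effects:
    more users of either group raise the platform's value for a participant."""
    return (same_side_weight * same_side_users
            + cross_side_weight * cross_side_users)

# A growing user base makes the SNS more valuable to an advertiser:
assert platform_value(10, 1000) < platform_value(10, 2000)
```

Negative network effects (e.g. congestion or advertising fatigue) would correspond to negative weights in such a sketch.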

In the context of SNSs, homing costs occur for users in the form of the time and effort they

invest in creating their profile, connecting with friends, and entering content into an SNS. On

the other side, the homing costs for advertisers include creating an equivalent advertiser profile

(e.g. an FB-Site) and gathering an audience for their products. The switching costs for users are

similar to the homing costs and include learning how to use a new network; the costs are the

same for the advertiser side. Finally, the positive, direct network effects for SNS users are clear,

as an SNS is only valuable for an individual if she can connect and communicate with her

friends within the network. In addition, there are strong positive indirect network effects

between the advertisers and the users, because an SNS becomes more valuable to an advertiser

as more people are available as a potential audience within the network. A more detailed

analysis of the effects within an SNS is provided in chapter 2.

Finally, SNSs are characterised by specific features that they provide on the users’ side. To

clarify the concept of SNSs, the definition from Kane et al. is used and slightly modified due to

the development of leading SNSs [46]. For the scope of this dissertation, an SNS is

characterised by the following services: “Users …

(1) have a unique user profile that is constructed by the user, by members of their network,

and by the platform;

(2) access digital content through, and protect it from, various search mechanisms provided

by the platform;

(3) can articulate a list of other users with whom they share a relational connection;

(4) and view and traverse their connections […]” [46].

Kane’s phrase “and those made by others” is excluded from the quotation and the given

definition because current major networks such as FB allow users to hide their friend lists from

others via their privacy settings and generally do not display them to third parties.

Together, the given classifications of DCBs, MSPs, and SNS features provide a complete

description of SNSs. Furthermore, this definition of SNSs is

equivalent to the common term online social networks (OSNs). It covers all major western

SNSs including FB, Twitter, YouTube, and app-based networks like Snapchat and Instagram.

However, FB is the main example for the following analysis, since it is currently the largest

SNS and the object of investigation in most scientific studies.

1.1.3 Social Network Operator

As mentioned above, SNSs are for-profit businesses run by professional companies, which are

hereafter denoted as social network operators (SNOs). This dissertation follows the definition

of Buchman et al., who characterise SNOs as companies which

“provide the underlying basic services […] and infrastructures, […] needed by users to interact

with each other“ [47]. Thus, an SNO is a company that maintains and provides the infrastructure

for an SNS. For the SNS FB, the company Facebook Inc. is the SNO; for the SNS YouTube,

the SNO is Alphabet, the parent company of all Google products.

Advertising is the prevalent form of revenue generation for SNOs [41]. The economic objective

of PD analysis in this context is as follows: detailed knowledge about SNS users,

especially their consumption preferences, allows advertisers to reach their target groups [49].

Thus, SNOs are dependent on SNS users revealing their PD to analyse their preferences for

target advertising, and on advertising customers paying to reach those users. They act as an

intermediary between the advertising clients and the SNS users.

In summary, SNSs are multi-sided-platforms with three different participants: the SNO, the

SNS users, and the business partners respective advertising customers [50]. Figure 3 illustrates

this triangular business interaction.

1.2 A Valid Privacy Definition for SNSs

The concept of privacy must be distinguished from that of data protection. For the SNS case,

data protection represents the protection of stored user data by the SNO against external and

internal attacks (e.g. via encryption). The notion of data protection thus reflects a paternalistic

approach where uninformed or uncaring users and their PD must be protected from malicious

attacks for their own good by the SNO [51]. This assumes that the SNO acts in its users’ interest,

either because of its own economic benefit or because it is forced to by corresponding legislation.

Figure 3. Triangular Business Interaction.

In contrast, according to Westin’s definition, privacy should enable individuals

to control their PD and its usage [2, 5]. Thus, the idea of privacy represents a self-determination

approach and is accordingly linked to the individual’s right to informational self-

determination [52].

A broad review is required to understand the scope of privacy in IS for SNSs. A comprehensive

framework for privacy in IS development was provided by Carew and Stapleton in 2005 [53].

The following paragraphs are based on their work and supplemented with subsequent privacy

insights and theories to provide a state-of-the-art summary of privacy research. The history of

privacy types, functions, and categories is outlined.

As noted, the history of privacy typology dates back to 1967 when Westin introduced his

definition of privacy and identified four privacy types: solitude, intimacy,

anonymity, and reserve [2]. Solitude stands for being alone and unobserved, intimacy represents

the seclusion of a small, exclusive group, anonymity stands for being unrecognised in

public, and reserve denotes the preference to limit information disclosure to third parties. In 1982,

Burgoon published her concept of privacy, which comprises four dimensions: social,

physical, informational, and psychological privacy [54]. Pedersen then extended Westin’s

typology by adding isolation and separating intimacy into intimacy with family and intimacy

with friends [55].

In addition to his typology of privacy, Westin outlined four functions of privacy: personal

autonomy, emotional release, self-evaluation, and limited and protected communication [2].

Personal autonomy represents self-identity and independence, and emotional release describes

the safe withdrawal from social life, roles, norms, and duties. Furthermore, Altman listed three

functions of privacy, including interpersonal functions, the interface of the self and the social

world, and self-identity [56]. In addition, Pedersen empirically identified the following five

basic functions of privacy: contemplation, autonomy, rejuvenation, confiding, and creativity

[55]. In 1998, Newell stated that, from a systems perspective, privacy provides a chance for re-

stabilisation, self-maintenance and self-development for individuals [57].

Clarke developed his privacy categorisation at the same time as Pedersen in 1999, and

divided privacy into the categories of privacy of the person, privacy of personal behaviour,

privacy of personal communication and privacy of personal data [58]. In 2008, Solove

published his taxonomy of privacy, relating privacy threats to the four activity types of

information collection, information processing, information dissemination, and invasion [59].

Due to technological progress, Finn et al. updated Clarke’s privacy categorisation in 2013 and

included, inter alia, the privacy of location and space as well as the privacy of association [60].

A compendium on privacy in SNSs and the corresponding threats, including a characterisation

of the three different privacy conditions (awareness, control, and trustworthiness) was also

presented by the Acatech study in 2014 [47].

Given the characterisation of SNSs and the outline of privacy research in IS, the following

definition for user privacy in SNSs was developed. For this dissertation and its analysis, user

privacy in SNSs is defined as:

(1) the capability of SNS users to control their PD and its collection, aggregation, analysis,

and possible transfer within the SNS and beyond, and

(2) the SNS users’ ability to optimize the amount of data disclosure and its security

against misuse with respect to their preferences.

This definition is applicable to all the privacy categories mentioned above and is

adequate for the subsequent chapters. A more detailed privacy framework is provided for the

evaluation of the GDPR in chapter 4.

1.3 Research Questions and Objectives

Since FB was launched in 2004, the presence of SNSs has consistently grown. The daily

SNS usage of individuals has continuously increased, and SNS users contribute more and more PD

to their networks [61]. Moreover, since the beginning of the smartphone age, SNSs have even

been following people from their desktops to every other place they go. SNSs encourage people

to share all information about their lives with their friends within the network and thus, with

the network. Consequently, Harvard Law professor Jonathan Zittrain deduced in 2008 that this

technology threatens “to push everyone towards treating each public encounter as if it were a

press conference” [62]. This thirst for PD can be explained by the business model of SNSs,

which heavily depends on gathering and analysing user data to deliver targeted advertisements

(cf. subchapter 1.1 and [39]). While the resulting markets created by SNSs and other data-driven

businesses with highly transparent data-subjects might be efficient [63], financial factors are

not the only aspects to consider in the debate on PD markets. The human rights aspect of privacy

must also be addressed to find a balance between economic efficiency and the data-subjects’

right to informational self-determination [64], because the extensive collection and analysis of

user data poses a severe threat to users’ privacy [65].4

Since this thesis exclusively addresses the challenge of privacy in SNSs from an

economic and IS viewpoint, subchapter 1.1 outlines the ecosystem of SNSs and shows that

these networks represent multifaceted economic businesses with several sides. Since SNOs act

as intermediaries and their networks connect advertisers and users, an examination of

user privacy in SNSs must start with an economic analysis of the MSP character of social

networks, their effects, and the resulting dynamics regarding privacy. By contrast, existing

models of privacy in PD markets focus on restructuring the market to provide general solutions

(e.g. the three-tier model of Novotny and Spiekermann [66]). An understanding of the economic

participants, processes, and interrelations is critical to the ensuing microeconomic investigation

of individual decisions about user privacy, and to the macroeconomic analysis of market

dynamics and governmental regulation. The objective of the first research question (RQ1) and

the associated chapter 2 is to describe the economic structure of SNSs as MSPs and to outline the

connected influences on user privacy at the system level. Therefore, the question is formulated

as follows:

RQ1: What are the economic participants, architecture, governance, and effects in the SNS

ecosystem at the system level, and how do they interrelate with user privacy?

To explore user privacy in SNSs, the next topic that must be investigated through economic

approaches is the individual level of SNS decision making. Various scholars have examined

privacy in SNSs and user privacy activities predominantly through behavioural economics [67–

69]. The results show that users value privacy, but do not act accordingly within SNSs and e-

commerce environments. This discovery was termed the “privacy paradox” [70, 71].

However, SNS users exhibited privacy-seeking behaviour (PSB) under observation to limit

their PD disclosure to SNOs [72]. Moreover, research attempts to estimate SNS users’ valuation

of privacy in monetary terms have been unsuccessful [73–75]. Nevertheless, it is clear that

privacy plays a part in the economic operations of online services such as SNSs [42, 63, 76,

77]. The research results of preceding scholars reveal a user privacy dilemma in SNSs: users

value their privacy, but they do not act appropriately, their PSB is not successful, and they are

not willing to pay a significant amount of money to maintain their privacy.

4 Passages of this subchapter originate from [29].

Chapter 3 and RQ2 illustrate the user privacy dilemma of SNSs to explain the economic aspects

of user privacy and its connected effects and problems on the individual level. It is necessary

to include not only the extraneous circumstances, but also the individual obstacles to privacy in

SNSs in the subsequent analysis of possible solutions. The purpose of RQ2, to model privacy

in economic terms, has been approached by other researchers. The most prominent attempt (by

number of citations) is the “privacy calculus” model developed by Dinev and Hart, which describes

consumer privacy decisions in e-commerce as a rational weighing process [78]. Chellappa and Shivendu

built upon the privacy calculus with a property rights approach to privacy to develop their

economic model of privacy in online businesses [79]. Furthermore, the three-tier model of

Novotny and Spiekermann, which contributes to solving the privacy controversy by assigning “clear

roles, rights and obligations for all actors to re-establish trust”, should also be mentioned [66].

These authors have provided general models that describe why consumers decide to disclose

personal information, but do not focus on this problem in SNSs. Moreover, they illustrated how

the implementation of property rights for data might contribute to solving the accompanying

privacy problem. Specialising in social networks, Krasnova and Veltri extended the calculus

model in 2010 and adjusted it to user choices in SNSs, illustrating why users decide to self-

disclose in the social network environment [80]. However, the objective of this dissertation and

of RQ2 is to deviate from this research track and provide a unique approach to outlining user privacy

in economic terms. The aim is to demonstrate that the PD disclosure of

SNS users is an economic transaction and the related privacy problems can be addressed with

classic economic models. Suitable economic solutions for the privacy challenge in SNSs and

its evaluation can then be derived. Consequently, RQ2 is formulated and partitioned as follows:

RQ2.a: Is SNS users’ PD disclosure interpretable as an economic transaction?

RQ2.b: Can the user privacy dilemma of SNSs be modelled in classic economic terms?
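As background for RQ2.b, the rational-weighing idea behind the privacy calculus discussed above can be caricatured in a few lines. The threshold rule and all values below are hypothetical and are not Dinev and Hart's actual operationalisation; the sketch only shows the structure of a disclose-or-not decision.

```python
# Toy sketch of a privacy-calculus-style decision (hypothetical values):
# a user disclosing PD weighs expected benefits against perceived risks.
def discloses(expected_benefit: float, perceived_risk: float,
              risk_aversion: float = 1.0) -> bool:
    """Disclose iff the expected benefit outweighs the (weighted) risk."""
    return expected_benefit > risk_aversion * perceived_risk

# A user who values the service highly and sees little risk discloses PD:
assert discloses(expected_benefit=5.0, perceived_risk=2.0)
# A sufficiently risk-averse user facing the same trade-off does not:
assert not discloses(expected_benefit=5.0, perceived_risk=2.0, risk_aversion=3.0)
```

In this caricature, the privacy paradox appears as users reporting a high perceived risk while behaving as if their risk weight were close to zero.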

In addition to the privacy dilemma in SNSs, empirical data and the current economic discussion

suggest that the SNS market is moving towards monopoly conditions [81, 82]. According to

classic economic theory, this condition requires market intervention, because it

cannot induce an efficient equilibrium [83]. This intervention can come from governments and

higher authorities through different provisions and regulations. An example of an approach to

improving user privacy and regulating the digital PD market is the GDPR [34]. The European

legislation has been examined by various scholars regarding its economic implications for the

affected businesses and its influence on the general level of data protection [84–88]. Yet, it is

unclear whether the GDPR can eliminate the causes of the user privacy dilemma in SNSs or

break the dynamics of the SNS market which threaten user privacy and drive it towards a

monopoly. Chapter 4 is therefore dedicated to answering the corresponding RQ3:

RQ3: Does the GDPR provide a regulatory solution for user privacy in SNSs, and what is its

influence on the SNS market?

Classic economic theory presents competition as a means of achieving an efficient market

equilibrium [83]. Thus, increasing competition in the SNS market might be an answer to the

user privacy dilemma and the monopolistic tendencies. The impact of competition between

online platforms and services on consumer privacy has been addressed by scholars. Bonneau

and Preibusch found empirical evidence of “vigorous competition for new users” between

different SNSs in 2010, but they concluded that the connected market for privacy in SNSs was

“dysfunctional” [89]. On a theoretical basis, Casadesus-Masanell and Hervas-Drane deduced

that SNSs will surpass their competitors if they choose to generate their revenues

by analysing PD, provided that users’ willingness to pay for privacy is relatively low [90]. In addition,

scholars of the HU Berlin theoretically demonstrated that SNSs will mine and use more PD as

their market position becomes stronger [91]. The objective of RQ4 is to build on these and other

findings about competition in the SNS and in the MSP market to draw conclusions about their

impacts on user privacy and the dynamics of the SNS market. Furthermore, those results were

matched with empirical market evidence, and potential interventions to increase competition

for user privacy in the SNS market were assessed. Building upon that approach, chapter 5 is

dedicated to answering the resulting two-part RQ4:

RQ4.a: Can competition enhance user privacy in SNSs?

RQ4.b: What interventions can direct and enhance the SNS market dynamics?

1.4 Outline and Contributions

As presented previously, the dilemma of user privacy in SNSs has been addressed by various

researchers from different scientific fields. The problem area of user privacy affects behavioural

economics as well as classic microeconomics and macroeconomics; computer science, IS,

and law research are also addressing the privacy problem. Consequently, the analysis and

possible solutions for the dilemma of user privacy in SNSs must follow a holistic approach.

This dissertation is predominantly allocated to the fields of IS and economics, and the IS

research field is divided into two different streams: a design science approach which emphasises

the conception and evaluation of IS artefacts, and a behavioural approach which focuses on the

evaluation and prediction of human and societal reactions to technology implementation [92].

According to this distinction, this thesis follows the behavioural methodology of IS research.

Therefore, an interdisciplinary methodological approach is used to address the RQs, drawing

on methods from IS, economic, and legal research. This dissertation provides an in-depth

analysis of the user privacy problem in SNSs and contributes to the corresponding research

fields by modelling the privacy dilemma, and illustrating and evaluating possible economic

solutions to enforce user privacy in SNSs.

Figure 4 provides an outline of this dissertation and briefly summarises the chapters assigned

to the respective RQs. This thesis is structured logically along the RQs and first analyses the impact

of the SNS market’s system structure on user privacy, before addressing privacy in SNSs at the individual

level by economic means. Moreover, the GDPR is addressed regarding its influence on privacy

and the corresponding market dynamics, representing governmental regulation of the SNS

market. User privacy as a competition factor in the SNS market is then analysed and potential

interventions to change the market dynamics in favour of user privacy are assessed. Finally, the

overall results of the dissertation are explained.

Figure 4. Outline of the Dissertation.

Chapter 6: Does Economic Competition & Regulation Provide a Solution to Privacy in SNSs?

The last chapter concludes the dissertation as a whole and provides an outlook for possible directions of future research in this area.

Chapter 5: Social Network Services: Competition and Privacy

Tackles RQ4.a in its first subchapter and outlines that free competition between SNSs contains mainly privacy-reducing elements. The second subchapter is dedicated to RQ4.b and assesses possible interventions to change the competition dynamics between SNSs towards user privacy as a competitive factor, concluding that interoperability would be best suited to do so.

Chapter 4: SNS Privacy and Regulation

Examines RQ3 and analyses the GDPR as representative of regulation on the SNS market. The results are that the GDPR is able to increase user privacy in SNSs but fails to change the privacy-harmful dynamics.

Chapter 3: Economic Modelling of Privacy in SNSs

Addresses RQ2.a in the first subchapter via a case study showing that PD disclosure by users can be interpreted as a payment method in SNSs and revealing that SNS users are subject to information asymmetry. Accordingly, the second subchapter tackles RQ2.b and models these information and power asymmetries via contract theory.

Chapter 2: Interrelations on Privacy in the SNS Platform Environment

Answers RQ1 by modelling the participants, architecture, governance, and effects of the SNS environment as well as their impacts on user privacy via system dynamics modelling. In doing so, the chapter furthermore provides the second research frame for SNSs and reveals privacy-harmful aspects of the SNS market dynamics.

Chapter 1: Privacy in the Age of Social Network Services

Provides the first research frame by defining SNSs, their MSP character, factors, and participants, as well as a state-of-the-art privacy definition. Sets out the RQs and their objectives and presents a list of the author's publications.

Chapter 1 builds the first framework for examining user privacy in SNSs. It provides the

required definitions of SNSs, SNOs, PD and privacy, as well as the RQs to pursue the

investigation of user privacy in the following chapters. Furthermore, it illustrates the MSP

character of modern SNSs, and outlines the involved parties and their economic interests. The

definitions and the analysis of the MSP character of SNSs are a compilation of corresponding

sections from the papers listed in Table 1.

Based on a multidisciplinary literature review of SNSs, MSPs, and privacy markets, Chapter

2 describes a detailed system dynamics (SD) model of user privacy in SNSs. It provides the

answer to RQ1 by modelling the participants, architecture, governance, and effects of the SNS

environment as well as their impacts on user privacy. The SD model illustrates that SNS users

are subjected to switching costs and lock-in effects which drive the SNS market towards

monopoly conditions. This chapter outlines the setting for the following analysis and is based

on the publication “Coherences on Privacy in Social Network Services” presented at the IFIP

Summer School 2016 in Karlstad, Sweden [30].

Chapter 3.1 provides the answer to RQ2.a about whether the disclosure of PD to SNOs can

be interpreted as a payment method through classical economic theory. It illustrates that user privacy in

SNSs can be understood as an economic problem and argues that the disclosure of PD in

exchange for SNS usage characterises a payment method (PM). In a case study that includes an

experimental part, it is shown via regression analysis that users lack

transparency to evaluate the amount of disclosed PD and the power to set their privacy

preferences. This chapter was presented and published as a full research paper “Personal data

as payment method in SNS and users’ concerning price sensitivity – A survey” at the

International Conference on Business Information Systems 2015 [28].

Chapter 3.2 supplies the equivalent answer to RQ2.b about whether the dilemma of user

privacy in SNSs can be modelled in economic terms. It models the privacy problem with

microeconomic contract theory, including the economic interests and motivations of the

participants and the distinction of three different market cases. It illustrates the information and

power asymmetries between SNS users and the SNO, and demonstrates the user privacy

dilemma. Furthermore, this chapter provides initial approaches to possible solutions for the

dilemma. This chapter was presented and published as a full research paper under the title

“Towards Balancing Privacy and Efficiency: A Principal-Agent Model of Data-Centric

Business” at the International Workshop on Security and Trust Management 2015 [29].

Chapter 4 addresses RQ3 about whether the GDPR as a governmental intervention can provide

a solution to the user privacy dilemma in SNSs. It carries the definition of privacy from

subchapter 1.2 forward to provide a framework for evaluating the legislation regarding its

privacy coverage. In the second step, the GDPR is consulted as the currently most advanced

privacy legislation and analysed based on this framework. The legal and technical obstacles of

the regulation are also assessed and its economic impact is outlined. It is argued that the GDPR

may increase user privacy in SNSs, but only to a certain level. Furthermore, the legislation

cannot change the privacy-threatening dynamics of the SNS market and its drive towards

monopolistic conditions. This chapter was submitted to the journal Business & Information

Systems Engineering under the title “The General Data Protection Regulation Impact on Privacy

– An Assessment for SNS” and is currently under review [32].

Subchapter 5.1 builds upon the previous results to address RQ4.a about whether competition

can enhance user privacy in SNSs. It reviews theoretical and empirical results from MSP

competition research and matches them with empirical evidence from the SNS market to deduce

the current and future development of user privacy. This analysis indicates that competition in the

SNS market is detrimental to, rather than reinforcing of, user privacy. Furthermore, it supports

the finding that the SNS market is moving towards monopolistic conditions. This chapter was

presented and published as a full research paper “Social Network Services: Competition and

Privacy” at the Wirtschaftsinformatik Conference 2017 in St. Gallen [31].

Pursuant to RQ4.b, the impact of the market-influencing parts of the GDPR and additional

regulatory interventions on the competition for user privacy is assessed and evaluated in

Subchapter 5.2 using the results of the previous chapters. It is revealed that interoperability in

combination with data portability represents the most promising approach to change the SNS

market dynamics towards more user-privacy-friendly competition. This chapter was submitted

to the Journal of the Association for Information Systems under the title “Options to enhance

Privacy Competition in the Social Network Market” and is currently under review [33].

Chapter 6 concludes this dissertation by providing a summary of its outcomes and discussing

the resulting implications. Subchapter 6.2 then presents possible directions for future work.

1.5 Remarks on Publications

This dissertation does not constitute a thesis in the classical monograph manner; it is

a monograph thesis with a strong cumulative character. While pursuing the research on user

privacy in SNSs for this dissertation, interim findings which now constitute essential parts of

this thesis were submitted to or published as full research papers in different proceedings and

presented at international conferences. At the outset of each affected chapter, the corresponding

information is provided as a footnote. Furthermore, a complete list of directly and indirectly related

publications is provided below in Table 1:

Publications (double-blind peer-reviewed)

[33] Nolte, C. G. (2018). “Options to enhance Privacy Competition in the Social Network Market”. In

Journal of the Association for Information Systems – submitted for review.

[32] Nolte, C. G. & Rosenberg, B. (2018). “The General Data Protection Regulation Impact on Privacy –

An Assessment for SNS”. In Business & Information Systems Engineering – submitted for review.

[31] Nolte, C. G., Schwarz, J., & Zimmermann, C. (2017). “Social Network Services: Competition and

Privacy”. In Wirtschaftsinformatik 2017 Proceedings.

[30] Nolte, C. G., Brenig, C., & Müller, G. (2016). “Coherences on Privacy in Social Network Services.”

IFIP Summer School 2016. Karlstad, Sweden, 21-26 Aug. 2016.

[29] Zimmermann, C., & Nolte, C. G. (2015). “Towards Balancing Privacy and Efficiency: A Principal-

Agent Model of Data-Centric Business”. In International Workshop on Security and Trust Management

(pp. 89-104). Springer International Publishing.

[28] Nolte, C. G. (2015). “Personal data as payment method in SNS and users’ concerning price sensitivity

– A survey”. In International Conference on Business Information Systems (pp. 273-282). Springer

International Publishing.

[27] Nolte, C. G., Zimmermann, C., & Müller, G. (2015). “Social Network Services' Market Structure

and its Influence on Privacy”. Amsterdam Privacy Conference 2015. Amsterdam, Netherlands, 23-26 Oct.

Table 1. Direct and Indirect Related Publications.

2 Interrelations of Privacy in the SNS Platform Environment

This chapter addresses RQ1 and takes the unique approach of a System Dynamics (SD) analysis

to understand and model the complex interrelations between SNSs, their environment, and

privacy. Studies on privacy were analysed and combined with results of MSP research in an

interdisciplinary literature review. Based on this review, a qualitative system dynamics model

of SNSs was designed to comprehend factors that directly or indirectly influence user privacy.

Analysing the model demonstrated that the main source of decreasing user privacy is

unintended data disclosure (UDD). Furthermore, the only elements that directly sustain privacy

are efficient privacy controls and raising user privacy awareness. The most notable result is the

multihoming feedback-loop, where user multihoming (MHG) behaviour increases market

competition for the time users spend on a network and prompts SNOs to include more platform

features. This feedback-loop leads to a rise in user data disclosure and an increase in user costs

in time and effort to switch to another network, which might result in a winner-takes-all

outcome for the social network service market in the long term.5
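The reinforcing character of this feedback-loop can be illustrated with a deliberately simple numeric sketch; the update rule and parameters below are hypothetical and are not part of the qualitative SD model itself.

```python
# Toy simulation (hypothetical dynamics, not the chapter's qualitative SD model):
# two competing SNSs where users migrate towards the larger network in
# proportion to its lead, i.e. positive direct network effects.
def simulate_share(share_a: float = 0.55, attraction: float = 0.1,
                   steps: int = 50) -> float:
    """Return network A's user share after `steps` migration rounds."""
    for _ in range(steps):
        lead = share_a - (1.0 - share_a)                 # A's advantage over B
        share_a += attraction * lead * share_a * (1.0 - share_a)
        share_a = min(max(share_a, 0.0), 1.0)            # keep a valid share
    return share_a

# Starting from a slight lead, network A's share only grows over time:
assert simulate_share(0.55) > 0.55
```

Under these assumptions any initial asymmetry is amplified, which is the mechanism behind the winner-takes-all outcome described above.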

The EU Commissioner for Competition, Margrethe Vestager, warned that “if just a few

companies control the data [...], that could give them the power to drive their rivals out of the

market” [93]. As Evans and Schmalensee found, the majority of DCBs that aggregate and

control this data are constituted as SNSs [94]. In addition, the research of Evans and

Schmalensee on SNSs as MSPs reveals an interrelation between the positive direct network effects among SNS users, which rise with an increasing user base, and increasing market power

[95]. These insights lead to the main research question RQ1 for this chapter: What are the

economic participants, architecture, governance, and effects in the SNS ecosystem at the

system level and how do they interrelate with user privacy?

Consequently, the aim of this chapter is to shed light on the opaque SNS platform

structure. The objective is to model SNSs as MSPs to investigate the different direct and indirect

influences on user privacy and their impacts on all platform dynamics. Therefore, a qualitative

SD analysis based on an interdisciplinary literature review was conducted.

5 This chapter includes and extends the paper [30].


This chapter is structured as follows: Initially, the literature review is presented to provide an

overview of the current MSP and SNS research, discussing the most relevant and controversial

findings. Building on that, coherencies and feedback-loops that are relevant to SNSs are

extracted and modelled with a bottom-up approach using SD. This is done using the findings

from the literature analysis to create an SNS core model containing the fundamental platform

parts. In the second step, further results in the form of the SNS periphery are implemented into

the SD model to complete the exposition (see section 2.1.1). The crucial and controversial parts

are then examined, including the implications for SNS user privacy to match them with

evidence from other empirical studies, market statistics, and media references. Finally, the

major findings and their implications are summarized and discussed.

2.1 A Framework for SNSs as Platform Businesses

Maxwell stated that a conceptual framework “is a simplification of the world […] aimed at

clarifying and explaining some aspect of how it works” and often based on a corresponding

literature review [96]. Consequently, an interdisciplinary literature review was conducted.

According to the methodology of Rowley and Slack, the review began with a brief search, followed by citation pearl growing on its results [97]. The scope of the review

included SNS-user behaviour, internet privacy, and advertising on SNS and MSP. The aim was

to identify economic and behavioural interrelations and feedback-loops within certain SNSs,

and the SNS market in general. Therefore, the focus was primarily, but not exclusively, on the

impact of network effects and MHG from an MSP perspective on the SNS environment.

Furthermore, building upon that objective, the elaborated interrelations, including cross-border

connections, are modelled with system dynamics (SD) to visualize and thereby clarify

economical and behavioural interdependencies. The advantage of the SD approach is that it

targets the analysis of information feedback: “It treats the interactions between the flows of

information, money, orders, materials, personnel, and capital equipment in a company, an

industry, or a national economy” [98]. Thus, the method is appropriate to target the complexity

of the SNS market and to combine the findings of studies on behavioural economics regarding

user behaviour within SNS with the theoretical as well as empirical results from both general

and SNS-focused MSP research.


The literature review focuses on influential work from the last 15 years on the topic of MSPs,

SNSs, or general privacy in the internet age. The literature was identified by performing a brief

search with Google Scholar and the Freiburg University library catalogue for the keywords

“multi-sided platforms” and “two-sided markets”, as well as “(online) social network

(services)” and “Facebook”, alone or in combination with “privacy”. Citation pearl growing

was then performed to extend the literature collection. Only papers that contained results

directly connected or transferable to SNSs or the SNS environment were kept and additional

appropriate literature was identified from these resources. The review results were then sorted

by the topics of MSPs, SNSs, and privacy, as well as the categories of framework, discussion,

modelling, and empirical work. A framework is an overall consideration and structure of a topic,

while a discussion considers certain research results, and modelling means generating a theoretical model. Empirical work involves the evaluation of surveys and market statistics. Finally,

the result was a database of 41 papers matching the research purpose (cf. Table 2).

Category / Topic MSP SNS Privacy Sum

Framework 1 1 0 2

Discussion 5 7 4 16

Modelling 5 3 1 9

Empirical work 2 9 3 14

Sum 13 20 8 41

Table 2. Literature Review Distribution.

2.1.1 Approach

The following structure, based on the MSP framework of Staykova and Damsgaard [44], was used while analysing the papers and determining the results, in order to identify the components for the SD modelling. First, four different SNS layers were distinguished:

1. Platform Participants: participants in the SNS environment (e.g. SNO, users, and advertisers),

including competing SNSs.

2. Platform Architecture: the core features of an SNS platform as well as subsequently integrated

features including apps and acquired services.

3. Platform Governance: possibilities of platform access, interactions, pricing policies in

monetary and non-monetary forms, and privacy policies.

4. Platform Effects: the different effects between participants caused by the platform

architecture or governance.


The analysis followed the evolutionary SNS approach of Staykova and Damsgaard [44].

The SNS core was determined first, consisting of the primal and fundamental features,

participants, and effects of an SNS, followed by the SNS periphery consisting of all additional

participants, features, and resulting effects (see Figure 5). This approach was used to build the

SD model.

Figure 5. SNS core and periphery illustration loosely based on [44].

2.1.2 Platform Participants

According to the analysis of Staykova and Damsgaard, the market-dominating SNS, FB, started

as a one-sided platform and evolved into an MSP [44]. This strategy was also used by many

other SNSs, and this finding is supported by the study of SNS history by Ellison [4]. Thus, the core participants of an SNS are only the users and the provider (the SNO).

Regarding the SNS periphery (cf. Figure 5), the first participants to recognize are competing platforms, a factor which is crucial for most MSP considerations [81, 99, 100]. Since the leading income source for an SNS is advertisement [39, 44], other important participants are advertisers [4, 39, 44, 49, 101]. In addition, FB integrated an application programming interface

(API) to motivate outside developers to enlarge the SNS's features with apps and games. App

developers are therefore important participants in the SNS periphery [44]. Furthermore, SNS

features seem to play a significant role in users valuing an SNS [44, 46, 67, 72, 102] and

spending time on it [72], and one important dimension of competition between different platforms is the time users spend on them [31, 102].


2.1.3 Platform Architecture

The core architecture of an SNS contains the basic features, including the opportunity for users

to create a profile, enter user-generated content (UGC), build a network with other users, and

message other users [44, 46, 102].

In the periphery, additional SNS features such as event management, group-specific messaging

and boards, external apps and games, acquired and integrated services (e.g. Instagram in the

case of FB), and identity management services (e.g. FB Login) can be found. For simplicity,

these features were all combined in a simple, quantitative proxy for the SD analysis. The

periphery also contains the most important component for monetizing the platform: the ability

to show targeted advertisements to the users, and the interface for advertisers to purchase these

advertisement spaces [44, 101]. Finally, user privacy controls are found in the periphery

because they are generally implemented at a later stage of the SNS when the basic features have

already attracted a high number of users [4, 103].

2.1.4 Platform Governance

There are two regulations to observe regarding the governance of the platform core. First, the

access policy of an SNS determines who is allowed to join the platform and under what terms

[44]. For example, FB only allowed students to join their platform in the early stage and ensured

this policy by approving only applications with a university email address [4]. Second, the

pricing policy was identified as a core element of platform governance. Pricing policy describes the monetary fees collected in a platform as well as non-monetary payments such as

the right to gather and analyse PD for commercial purposes. In this regard, all successful SNSs

have chosen to grant free access and gather PD for targeted advertisement as their prevalent

business model [4, 39, 44, 46].

Another part of the governance is the set of interactions provided and allowed between the different

platform participants. For the platform core, only the interactions between the platform users

are considered as essential for the SNS to gain UGC (cf. [44]). Moreover, the gathered user data

and users’ interest in privacy and control lead to the implementation of privacy policies, to

transparently illustrate the data use to users and improve their trust via privacy control options

[46, 72, 77, 104–107]. However, few SNS users seem to understand and use them [28, 68, 108].


Examining the platform periphery demonstrated that the access policy is enhanced by

regulations not only for advertisers as expected, but also for app developers when an open API

is integrated. For some SNSs identity management tools also become part of their access policy

by implementing login services (cf. FB, Google+ or Twitter sign-in-services). Once the access

policy is extended, the pricing policy must be extended too, since advertisement space on SNSs

is usually auctioned to the highest bidder for a certain keyword or target group [49]. App

developers must share their revenue from a platform with the SNOs and the user often pays for

using the sign-in services with the disclosed information.

The possible interactions furnished by the SNOs extend into the periphery. Platform users can

interact with each other and with celebrity and company fan-sites and their advertisements, use

apps integrated by external app developers, and use their SNS profile as an identity management

tool to sign-in to external services and websites. Likewise, companies can run advertisements

for specific target groups, and are also allowed to create confirmed company profiles to provide

events or special offers through the SNS to their followers. External app-developers can collect

money through their apps to gain profit, acquired services are integrated into the platform, and

their UGC and users coalesce with the original SNS (cf. Facebook & Instagram or WhatsApp).

2.1.5 Platform Effects

In this section, the resulting effects between the above-mentioned platform participants and

different layers are determined. The platform core is first considered and then enhanced with

the periphery. The positive direct network effects between different users of an SNS are

identified, meaning that each user makes the platform more valuable for other users by her

membership and by contributing more UGC, resulting in a higher adoption rate [109–111]. A

higher amount of UGC and higher platform activity correlates with a higher quantity of

unintended data disclosure (UDD) [62, 72], which decreases users’ privacy. The number of

SNS features also has a positive influence on the UGC and on the time users spend within a

network [67, 72, 102], while the time spent has a positive influence on users' expertise with

the SNS. Finally, implementing privacy policies and controls has a positive effect on users’

privacy and on their trust in an SNS and their awareness of data usage by the SNO [46, 69, 77,

112, 113]. Increased trust can enhance users’ willingness to add UGC [103, 105, 114], but may

also have the opposite effect if they become suspicious about the data claims of their SNO [29].


With the integration of additional platform participants, the SNS periphery adds more notable

effects to the overall image. Indirect network effects resulting from the users make the platform

more valuable for advertisers and app developers, and have a positive influence on their

adoption rate [44, 81, 94, 115]. While the creativity of the app-developers and the newly created

SNS features seem valuable for SNS users [67, 102], the user acceptance of advertising is still

not clarified. Knoll notes that SNS users generally accept advertising “as long as it keeps a

valued service free of charge” [101], whereas Tucker suggests that social advertising in SNS

can “backfire” and should be flanked with adequate privacy controls [104, 113]. Other

researchers have reported similar findings [72, 76, 114, 115].

With competitive SNSs, the effect of multihoming (MHG) arises when users utilise various

SNSs simultaneously [81]. Findings in research about MHG implications are manifold and, in

some cases, contradictory. Haucap and Heimeshoff suggest that MHG for SNSs is “principally

easy” [81], and Hyytinen and Takalo found that consumer awareness enhances MHG [116].

These assumptions are supported by recent statistics showing that more than half of all online

adults in the US use at least two SNSs [61]. Nevertheless, Mital and Sarkar argued that smaller

networks are only attractive for MHG with exclusive content [117], and Choi supported this

position by asserting that MHG can only be welfare enhancing if there is exclusive content for

each platform [118]. Moreover, Zhang and Sarvary mention that MHG leads to an overlap in

content and a winner-takes-all equilibrium [111]. These outcomes lead to the expectation of

rising competition between different SNSs for users and their generated content which

contradicts the findings of Doganoglu and Wright, who argued that MHG weakens competition

[119].

In theory, competition for users will lead to a greater regard for user interests by SNSs. This

conjecture is partly supported by Eisenmann et al. in 2006, who note that users, as the more

price sensitive side of the platform, are subsidized in the case of MHG [109]. Users are

considered the more price sensitive side because they are not willing to pay a significant amount

of money to use an SNS that maintains their privacy [120], and users are also price sensitive

regarding the disclosure of PD (cf. subchapter 3.1). In contrast, according to the results of Rochet and Tirole (2003), MHG on one platform side will intensify price competition on the other side [99]. This

is supported by the findings of Armstrong in 2006, who found that the single homing side will

be treated well and the MHG side’s interests will be ignored by the platform provider [100].


However, this subproblem about the MHG behaviour of advertisers in SNSs is not pursued in

detail in this chapter, but is approached in chapter 5.

2.2 System Dynamics Modelling of the SNS Environment

In this subchapter, a system dynamics (SD) model for SNSs is described, which results from the literature analysis. This task is divided into two steps, including the development of the core model containing only the fundamental platform participants, architecture, governance, and the resulting effects (cf. subchapter 2.1.1), to explain the interrelations in SNSs and their influence

on user privacy. A more complex SD model was also created, to illustrate the SNS periphery

and its influence on user privacy.

The box variables of the core model are blue to differentiate them from the black elements of

the periphery. The positive connections of the core model are also blue unless they cut across

other elements, in which case they are green. Moreover, the negative connections are red for

the core model, orange for the periphery model, and pink in the case of overlap (cf. Figure 6 &

Figure 7).

2.2.1 The Core

Building the SD SNS core model (see Figure 6), the literature review reveals the core containing

only two participants: the SNO and the users. For simplicity, a separate variable for the SNO is omitted from the model and only the architectural items are included, as well as the regulations and

effects proceeding from it. The users are modelled in two variants: the variable of Potential

Users which represents all internet users who are not part of the SNS but could join it, and the

variable of Users which denotes the actual users of the SNS. Both relate to the Adoption Rate

U, indicating the rate at which people are joining the SNS. The quantity of users has a positive

influence on the adoption rate, representing the positive, direct network effect of the user to

potential users. Furthermore, the Adoption Rate U has a negative influence on the Potential

Users, showing that as more people join the network, there are fewer potential users.
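As an illustration only (the dissertation's model stays qualitative), the Users / Potential Users stock-and-flow structure described above can be simulated once an adoption function is assumed. The logistic contact term, the `simulate` helper, and all parameter values below are hypothetical choices, not part of the model:

```python
# Illustrative sketch: a minimal quantitative simulation of the
# Users / Potential Users stock-and-flow structure. The qualitative SD
# model fixes only the signs of the influences; the adoption function
# and all parameter values here are assumptions for demonstration.
def simulate(potential=1000.0, users=10.0, contact_rate=0.002, steps=50):
    """Simulate adoption over time; returns the Users stock per step."""
    history = [users]
    for _ in range(steps):
        # Adoption Rate U: grows with Users (positive direct network
        # effect) and with the remaining Potential Users pool.
        adoption = min(contact_rate * users * potential, potential)
        users += adoption        # positive influence on Users
        potential -= adoption    # negative influence on Potential Users
        history.append(users)
    return history

history = simulate()
# The resulting S-shaped curve saturates once Potential Users is depleted.
```

Under these assumptions the Users stock grows monotonically and levels off at the total population, reproducing the familiar diffusion pattern implied by the signed influences.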


As an architectural item, the variable SNS Features is included, representing the quantity of all

features available for SNS users.6 The more features that an SNS provides, the more time users

spend within the platform [102]. This is represented by the variable Time Users Spend in SNS.

The more features they use the more User Generated Content (UGC) they contribute. In

addition, the more time users spend on an SNS, the more often Unintended Data Disclosure

(UDD) occurs because most users are not aware that all their actions on the SNS are being

tracked [28, 121]. This variable stands for the quantity of data disclosed unknowingly to the

SNS by the users and provides a proxy for the loss of privacy. The same quantitative correlation

also exists with the total number of users; the more users that an SNS has, the more UGC is

created within it. The Privacy variable is also included for reasons of clarity and

comprehensibility, and UDD has a negative impact on privacy.

The SNS governance variable of Access Policy has a negative influence on the Adoption Rate

U, representing the opportunity for an SNO to limit the SNS membership to certain groups (e.g.

students) or to specific conditions (e.g. full real name registration). If there are no restrictions

to the SNS access, the variable Access Policy is zero, as is its influence on the adoption rate.

The other governance variable in the SD core model is Privacy Policy & Controls, representing

the implementation of a transparent privacy policy and privacy controls that enable users to

control the visibility and use of their data. The Privacy Policy & Controls variable has a direct

negative influence on UDD; furthermore, this variable has a positive influence on the users’

trust in the SNS, represented by the Trust variable, which has a positive influence on the UGC

[105]. As mentioned above, if no privacy policy and privacy controls are implemented by the

SNO, the variable of Privacy Policy & Controls is zero, as is its influence. The last variable

included in the core model is User Awareness, indicating how aware users are on average of

the possible UDD and general privacy problem. User Awareness has a negative influence on

UGC, as privacy-aware users use fewer SNS features and contribute less UGC to avoid UDD

(cf. subchapter 3.1).

The SD model shows that the main negative influence on user privacy for the SNS core

(characterized by the positive influences on the UDD variable) is the pure quantity of UGC,

which is caused by the number of SNS users and features and is supported by users’ trust in an

SNS (see Figure 6). Another source of UDD is the time users spend on the SNS. Apart from that, the only positive influences on user privacy seem to be User Awareness and the implementation of Privacy Policy & Controls, at least for the SNS core.

6 Considering the influence of additional features, it is assumed that all features are at least implemented as usability neutral and do not corrupt the user value of the SNS.

Figure 6. SD SNS Core Model.
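The qualitative core model can be checked mechanically by encoding it as a signed digraph and multiplying edge signs along all simple paths into Privacy. This is an illustrative sketch, not part of the dissertation; the variable names are shorthand for the model elements, and the edge list restates the main influences described above:

```python
# Sketch: the SD SNS core model (Figure 6) as a signed digraph.
# +1 encodes a positive causal influence, -1 a negative one.
EDGES = {
    ("Users", "AdoptionRateU"): +1,            # direct network effect
    ("AdoptionRateU", "Users"): +1,
    ("AdoptionRateU", "PotentialUsers"): -1,
    ("AccessPolicy", "AdoptionRateU"): -1,
    ("Users", "UGC"): +1,
    ("SNSFeatures", "UGC"): +1,
    ("SNSFeatures", "TimeSpent"): +1,
    ("TimeSpent", "UDD"): +1,
    ("UGC", "UDD"): +1,
    ("UDD", "Privacy"): -1,
    ("PrivacyPolicyControls", "UDD"): -1,
    ("PrivacyPolicyControls", "Trust"): +1,
    ("Trust", "UGC"): +1,
    ("UserAwareness", "UGC"): -1,
}

def path_signs(source, target, path=None):
    """Yield the product of edge signs along every simple path source -> target."""
    path = path or [source]
    for (a, b), sign in EDGES.items():
        if a == source and b not in path:
            if b == target:
                yield sign
            else:
                for tail in path_signs(b, target, path + [b]):
                    yield sign * tail

for var in ("SNSFeatures", "UserAwareness", "PrivacyPolicyControls"):
    print(var, sorted(set(path_signs(var, "Privacy"))))
# SNSFeatures reaches Privacy only via negative paths, UserAwareness only
# via a positive one, and PrivacyPolicyControls shows both signs: the
# direct control effect (+) versus the trust -> UGC -> UDD channel (-).
```

The ambiguous sign of Privacy Policy & Controls mirrors the text's observation that the trust-driven increase in UGC may partly offset the reduction in UDD.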

2.2.2 The Periphery

Three new platform participants were identified by extending the core model for the periphery

(see Figure 7). First, the Advertisers, Potential Advertisers, the Application Developers, and

the Potential Application Developers are all represented by identically named variables.

Similar to the user variable construct, adoption rates exist for these participants. Adoption Rate A of

the advertisers is positively influenced by the quantity of users, as is Adoption Rate AD of the

app developers, while the quantity of the Application Developers increases the quantity of SNS

Features. Moreover, Adoption Rate A is positively influenced by the Time Users Spend in an

SNS and by the new variable of Profiling Capabilities, which stands for the SNO’s ability to

profile and target specific user groups for advertising, which is driven by the quantity of UGC

and UDD. Finally, the implementation of Advertisers has a positive effect on User Awareness

because SNS users apprehend data transfer to third parties [113].


The last platform participant is the variable of SNS competitors, which represents other SNSs

competing for the same users, advertisers, and app developers. This variable has a positive

influence on the Competitive Pressure in the markets; it positively influences MHG and vice

versa. High Competitive Pressure indirectly increases the number of implemented SNS

Features because the SNO wants users to spend more time in its own SNS [102], which leads

to an increase in the last variable for the periphery: User Expertise. This variable represents

how experienced users are at dealing with the SNS and reflects the switching costs in terms of

effort to switch to another SNS or use it simultaneously. By increasing the switching costs, the

User Expertise has a negative influence on MHG, which closes this feedback-loop. The

positive effect of User Awareness on MHG [116] then completes the full SD SNS model (see

Figure 7). Due to contradictory findings regarding the effects of advertising on SNS user

behaviour (cf. subchapter 2.1.5), none of those effects were implemented. Likewise, the controversial impacts of MHG were not implemented.

Focusing on Privacy and the negative influences of UGC and UDD on it, the periphery provides

several new insights. First, the UGC and UDD both positively influence the SNS Profiling

Capabilities which are needed by the SNO to maintain and extend its market position for

advertisers, especially in the face of strong Competitive Pressure. Second, MHG negatively

influences privacy due to its positive effect on UGC as well as its indirect positive effects on

Competitive Pressure. The inclusion of App Developers also negatively influences privacy due

to the App Developers' positive effect on SNS Features and on UGC.


Figure 7. SD SNS Core & Periphery Model.
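As a small illustrative check (again, not part of the dissertation), the polarity of the multihoming feedback-loop can be computed from its edge signs: in SD terms, an odd number of negative links yields a balancing (self-limiting) loop. The variable names below are informal stand-ins for the model elements:

```python
from math import prod

# The MHG feedback-loop of the periphery model, as signed edges.
# The final link is negative because higher user expertise raises
# switching costs and thereby weakens multihoming.
MHG_LOOP = [
    ("MHG", "CompetitivePressure", +1),
    ("CompetitivePressure", "SNSFeatures", +1),
    ("SNSFeatures", "TimeSpent", +1),
    ("TimeSpent", "UserExpertise", +1),
    ("UserExpertise", "MHG", -1),
]

polarity = prod(sign for _, _, sign in MHG_LOOP)
print("balancing" if polarity < 0 else "reinforcing")  # prints "balancing"
```

The single negative link makes the loop balancing: the competition-driven feature race that increases data disclosure is itself damped over time by rising switching costs, consistent with the winner-takes-all tendency discussed in the text.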


2.3 Insights from the SNS SD Model

In the following section, the results from the SNS SD model that was constructed from the

findings of the literature review are discussed and matched with theoretical and empirical

evidence. The limitations of the review and of the model are also considered.

2.3.1 Core Model Results

As derived from the core model (cf. subchapter 2.2.1), the only direct negative influence on user privacy in SNSs is UDD, which occurs while users contribute any kind of UGC to the SNS or when they browse it. This result aligns with other empirical and theoretical findings (cf. subchapter 3.1)

and other research showing that users constantly underestimate the amount of data which is

tracked and gathered by SNOs during the direct or indirect usage of their services [68, 69, 72,

106, 121]. Furthermore, it fits the perception that SNS environments push users to reveal

more PD and thereby overcome users’ initial PSB [62, 72, 112].

Besides the general user awareness and the related user restraint when contributing UGC, the

only source that seems to limit the UDD is the implementation of privacy policies and controls,7

which is also consistent with theoretical findings (cf. subchapter 3.2). Implementing privacy

policies and controls is also beneficial for the SNO because it increases users’ trust in the

network and motivates them to contribute more content. This assumption is supported by the

latest implementation of privacy controls by the leading platforms [122, 123]. Thus, the trust

effect and increase in usage and UGC might compensate for the decrease of UDD, though this

conjecture requires further research. The influence of the direct positive network effects of the user base on the user adoption rate of an SNS is clear. Recent statistics showing that the largest SNS, FB, continues to grow support this interrelation [61, 124].

2.3.2 Extended Model Results

Continuing with the full SD SNS model that includes the periphery (cf. Figure 7), the first

finding is that the number of SNS users influences the adoption rate of external app developers

and advertisers. While this result seems intuitive for advertisers and app developers, it can only be confirmed for the advertisers in the FB case, where the revenue from advertising increases with the number of users [124].

7 For simplicity, only well-functioning privacy policies and controls are assumed.

If advertising is a main revenue source for an SNS, the profiling capabilities and the ability to

show targeted advertisement to a precise target group becomes crucial for the SNO. In addition

to the capability to analyse data, gathering data through UGC and UDD is an important factor.

If the gained user trust and the concomitant increase in UGC from implemented privacy policies

and controls cannot compensate for the decrease in UDD, SNOs seem indifferent to the efficacy

of those controls. The trend of FB’s default privacy settings from 2004 to 2015 towards making

more personal details open to the public by default supports this conjecture [125, 126], as do

theoretical models and discussions from the literature review [17, 29, 49, 112]. However, recent

trends in platform business seem to oppose this notion with better privacy-by-default settings

and hints on how to use implemented privacy controls [122, 127].

The most interesting result of the model is the feedback-loop regarding the interrelations

between SNS competitors, the competitive market, and user MHG behaviour. The outcome of

the literature review shows that MHG strengthens SNS competitors and competitive market

pressure regarding the fight for users’ time (cf. subchapter 2.2.2), causing SNOs to include more features in their platforms such that users will spend more time on them. The SNO benefits

as users disclose more data by spending more time in the SNS and the user expertise for the

specific network improves. The latter increases switching costs in terms of practice efforts if

users want to switch to or use another SNS, which weakens MHG behaviour.

These interrelations seem to fit with FB; while MHG behaviour becomes more common [61],

FB purchases and includes more external services (Instagram, WhatsApp), and broadens self-

developed features such as instant articles or recent streaming of exclusive TV series [128,

129]. This MHG feedback-loop supports the findings of Zhang and Sarvary, who suggested that

MHG behaviour leads to an overlap in content and to a winner-takes-all equilibrium where the

biggest platform wins when there is no longer any difference in content [111]. In addition, the

results of Kwon, and Staykova and Damsgaard regarding platform stickiness by additional SNS

features are relevant [44, 102]. However, these results contrast with the theoretical findings by

Doganoglu and Wright that MHG in MSPs should weaken competition [119]. This allegation

might be true regarding the competition for users but it misses the increase in competition for

user time spent. Statistics support the finding of the MHG feedback-loop and show that FB

leads significantly in daily usage against its competitors [61].


2.3.3 Limitations

The literature review contains only a small part of all available work, and additional literature

could be used to validate and refine the model and its results. In addition, some of the included

literature about multi-sided platforms does not fully fit the SNS case. While some of the papers only address MSPs theoretically and others analyse similar platforms, their findings were transferred to the SNS case. The theoretical models, which had contradictory results, were based on different assumptions. Nevertheless, those models were all built to describe the

processes of, and those related to MSPs.

2.4 Conclusion

The aim of this chapter was to understand the interrelations within SNSs and the influence of the surrounding market structure on users’ privacy. A literature review was conducted, analysing 41

papers from the last 15 years concerning the topics of MSPs, TSMs, and privacy for SNSs. The

analysis of these papers followed a framework that is loosely based on the work of Staykova

and Damsgaard to identify the main elements and interrelations of SNS [44]. The analysis was

separated into an SNS core, which contains only the fundamental functions of the platform, and

a broader concept of an SNS periphery, which includes all important parts in the SNS market

structure. In a second step, the SD approach was used to develop a qualitative model from the

results of the literature review. The procedure was divided into several steps to first model the

SNS core and then to extend this base with the findings of the SNS periphery.

The core model showed that the main negative influence on user privacy is UDD, which is caused

by the amount of UGC that users add to the SNS and the time users spend within the network.

Furthermore, UGC is positively influenced by the number of users and the available SNS

features, while the time users spend on the SNS is also influenced by the number of SNS

features and users’ trust. The positive effects on privacy include implemented privacy policies

and controls and general user awareness. The enhanced periphery model revealed that UGC

and UDD are positive factors for the profiling capability of the SNO, which is sold in terms of

targeted advertising to customers. Moreover, the model shows that user MHG behaviour has a

positive influence on UGC and indirectly on the market competitiveness. Therefore, there are

secondary negative impacts on user privacy.


In subchapter 2.3, these findings were discussed and matched with empirical evidence and

media reports. The most remarkable finding is the MHG feedback-loop, showing that user

MHG behaviour strengthens SNS competitors and competitive market pressure, which leads to

SNO including more platform features and users spending more time in the network. This in

turn has two positive consequences for the SNO: First, users disclose more data when they

spend more time in the SNS. Second, user expertise for the SNS increases, which raises users’

non-monetary switching costs and weakens MHG. This might lead to a winner-takes-all

outcome for the SNS market in the long term. Recent market developments, data on user MHG

behaviour, and daily usage of SNS strongly support this finding.

3 Economic Modelling of Privacy in SNSs

The delineation of the complex SNS ecosystem in the previous chapter shows that SNSs contain

several economic processes, including the usage of PD as an asset class that is first traded from

SNS users to the SNO and then refined and sold in the form of targeted advertisement to

business customers. As a reward for this loss in privacy, SNS users gain the right to use the

SNS without further charge. This chapter is dedicated to answering research question RQ2.a

and RQ2.b, to determine whether the transfer of PD from users to SNOs can be interpreted as

a transaction method and whether the user privacy dilemma can be modelled by means of

classic economic theory. Therefore, this chapter is partitioned into two main parts: the first part

depicts the perception of PD as a PM for users in SNSs and the second part models the SNO–

user relationship as the principal-agent problem from classical economic contract theory.8

In accordance with RQ2.a, the guiding questions of the first subchapter determine whether PD

can be a PM for SNSs and whether SNS users show price sensitivity regarding the demanded

extent of PD. In a survey among 300 FB users, the interviewees were asked which FB functions

they used. In an experimental part within the survey, they chose from these functionalities given

a direct trade-off for specific PD exploitation rights. The results show that the interviewees are

sensitive regarding the price in terms of PD. In general, they chose fewer functions than they

claimed to use on FB, even if the demanded price was lower. The findings support the theory

that users misinterpret SNO data exploitation rights and indicate a strong IA between the SNS

users and SNO relating to the aggregation and usage of PD.

These results are addressed in the second subchapter to answer RQ2.b. Based on a commodity-

centric notion of privacy, it takes a principal-agent perspective on DCBs such as SNSs. This

subchapter presents an economic model of the related IA privacy problem by drawing from

classic economic contract theory. Building upon a critical analysis of this model, it is examined

how regulatory and technological instruments could balance profit seeking of markets for PD

and data-subjects’ right to informational self-determination and privacy.

8 This chapter contains extended versions of papers [28, 29].

3.1 Case Study: PD as a Payment Method in SNSs

Social Networking Services can be used for free in monetary terms. Nevertheless, studies

suggest that users are aware that they pay to use the functionalities of an SNS by revealing their

PD to the SNO for commercial exploitation [130, 131]. From this perspective, PD is a PM for

SNSs. Thus, the question of whether SNS users care about privacy relates to the question of whether

SNS users show price sensitivity regarding the exploitation of PD as a non-monetary PM. In

the current situation, opportunities for SNS users to show this price sensitivity are limited

because they face a “take it or leave it problem” [132]. People only have the choice between

participating in internet social life through SNSs and accepting that all their disclosed

information will be exploited, or not signing up with SNSs at all.

This subchapter presents the results from a survey of 300 FB users. The “take it or leave it

problem” was bypassed by directly asking the interviewees if they would choose single SNS

functions in direct exchange for specific PD exploitation rights. The survey indicates that users

show price sensitivity, as interviewees chose fewer SNS functions when more PD exploitation

rights were demanded. Moreover, the interviewees chose fewer functions than they use on FB,

even if the demanded price was lower than the FB data exploitation procedure.

Multiple linear regressions for interviewees’ decision with the quantity of chosen FB and SNS

functions as the dependent variable and the estimated values representing personal attributes as

explanatory variables provided insight.9 The outcomes support findings that users cannot fully

enforce their preferences inside the FB environment [72, 133]. Furthermore, the negative

influence of the education level, observed only for the stated use of FB functions, indicates that

recognizing the amount of PD exploited by FB correlates with higher education, supporting similar

findings [134].

3.1.1 Related Research

Several researchers targeted the related questions about whether users value privacy and are

willing to disclose PD, and which factors influence them to do this. The question of whether

people are willing to pay a monetary amount for privacy in SNSs has been examined by

different scientists with negative results [74, 131]. Moreover, Stutzman et al. found that FB

9 Personal attributes include privacy awareness, education level and gender.

users exhibited increasing privacy-seeking behaviour (PSB) [72], whereas studies revealed that

scandals about data leaks in SNSs had no sustainable effect on users’ behaviour and use of

SNSs [132, 134]. A related topic is the privacy paradox, which explains that users state that

they value privacy but behave otherwise [71]. Moreover, there is a significant link between

privacy concerns and self-disclosure, indicating that “users do account for privacy risks when

they decide to self-disclose” [131]. Acquisti and Gross as well as Krasnova et al. found a

discrepancy between claimed privacy concerns and disclosure behaviour on SNSs, which could

be partially explained by the fact that users trust SNOs and network members, and this trust

relies on users’ ability to control access to PD [69, 106]. However, other scholars found that

changes in disclosing behaviour are partly explained by policy and interface changes

implemented in SNSs, which countered PSB [133, 135], and by a lack of awareness of and

control by the SNS users [136].

3.1.2 Hypothesis and Procedure

In classic economics, a PM is defined as “something a customer gives to a vendor, if she wants

to acquire a specific good or service” [83] or as “the agreed way in which a buyer pays the

seller for goods” [137]. Combining these definitions with the findings of Hui et al. that

“consumers are willing to give up privacy in return for other benefits” [130] indicates that PD

is recognized and accepted on the users’ side as a PM, where the functionalities of an SNS can

be seen as the described benefits. Moreover, PD is accepted from the SNOs as payment, as

collecting, aggregating, and analysing users’ PD is a crucial aspect of their business model (cf.

chapter 1.1.3). From this viewpoint, the disclosure of PD including the transfer of certain usage

rights to the SNO can be perceived as payment for SNS’ functionalities. Using disclosed PD as

PM and showing that users react price sensitively and value privacy, instead of searching for a

monetary value of users’ privacy, is a unique approach. In addition, controlling the experiment

results with multiple linear regression provides insight that supports current theories about why

SNS users initially behave according to the privacy paradox.

If users value privacy, they should react price sensitively to the trade-off of PD for SNS

functions. Therefore, the first hypothesis (H1) for this subchapter is:

H1: SNS users react price sensitively to PD as a PM.

Hypothesis 1 is difficult to verify in practice. While users already pay for SNS functions by

revealing their PD, they face an IA, which makes it complicated for them to show price

sensitivity. Whereas the SNO is fully informed about the amount of data and details which are

stored and how they are used, users seem neither sufficiently informed nor enabled to choose the

extent of data collection and usage [72]. Since users face a “take it or leave it problem” [72],

the second hypothesis (H2) is:

H2: Users are unable to show their price sensitivity within current SNSs.

By conducting a survey of FB users and integrating an experimental component, the IA and the

“take it or leave it problem” were bypassed. General questions about FB and SNS usage

behaviour allowed for the calculation of interviewees’ personal attributes. In the second part,

the interviewees stated the FB functions that they used. In the experimental third part, the

interviewees answered whether they would choose several SNS functions in exchange for

certain PD exploitation rights. This part verified H1 while the comparison of the second and

third part showed some evidence regarding H2. In addition, the multiple linear regression

analysis with the personal attribute estimates as explanatory variables and the results of either

the second or the third survey part as the dependent variable yielded the results to prove H2.

The survey was done in German, implemented with the Sosci-Survey software,10 and

distributed from 03/06 to 04/30/2014 through FB, Twitter, and email with a request to

redistribute the link. FB users were chosen as the target group for three reasons. First, FB is the

current SNS market leader, and a survey of FB users made it easy to reach a high number of

potential interviewees. Second, it was simple to share the survey’s link through FB, which made

it possible to reach more than 300 participants within the first 14 days. Third, it was helpful to

focus on only one SNS because the survey compares actual behaviour with potential behaviour.

Thus, mixing different groups from different SNSs could have led to misleading results.

There were four different questionnaires containing the same first and last parts, but varying in

the middle.11 In the first part, each interviewee was asked if she uses FB, five questions to

measure her privacy awareness, one question regarding her FB usage behaviour, and one

question about her number of FB friends. Each interviewee was then asked which FB functions

10 http://www.soscisurvey.de (accessed 07.12.2017).

11 The full questionnaire as well as additional survey results and graphics can be found in the appendix.

she uses regularly with a multiple-choice question. Seven different functions were available to

choose from, including chat, messenger, newsfeed, event management, photo uploading and

sharing, video uploading and sharing, and apps inside FB.

Each participant was allocated randomly into one of four different groups to distinguish

different function prices. The groups contained questions about the direct trade-off of PD

exploitation rights against the same six functions. The groups differed regarding the price of the

functions. In group 1, this price equalled MND-z for all functions, the price was MND in group

2, it equalled MND+x in group 3, and it was MND+x+y in group 4. Where MND is the minimal

necessary data to run a specific function and the right to process the data by the SNO, while x,

y, and z are additional PD exploitation rights.

Group 1: MND – z, where z = content of conversations and comments.

Group 2: MND.

Group 3: MND + x, where x = answering a marketing question.12

Group 4: MND + x + y, where y = updating relationship status.
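The four price levels form a strict hierarchy of demanded exploitation rights, which is what lets a falling number of chosen functions from group 1 to group 4 be read as price sensitivity. This can be sketched as sets; the right names below are illustrative shorthands, not the questionnaire's exact wording, and z is treated as part of MND since group 1 removes it:

```python
# Illustrative encoding of the experimental price bundles as sets of PD
# exploitation rights; the right names are made-up shorthands.
MND = {"profile basics", "function usage data", "content of conversations and comments"}
x = {"answering a marketing question"}   # equivalent to the FB like-system
y = {"updating relationship status"}
z = {"content of conversations and comments"}

PRICES = {
    1: MND - z,      # group 1: MND minus z
    2: MND,          # group 2: MND
    3: MND | x,      # group 3: MND plus x
    4: MND | x | y,  # group 4: MND plus x and y
}

# Each group's price is a strict superset of the previous group's price,
# so the demanded amount of PD rises monotonically from group 1 to group 4.
assert PRICES[1] < PRICES[2] < PRICES[3] < PRICES[4]
```

The strict-subset chain is the design property the experiment relies on: any two adjacent groups differ in exactly one exploitation right.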

In the third part, questions were asked about the highest educational achievement, gender, and

age. An illustration of the complete survey composition is shown in Figure 8.

Figure 8. Survey Composition.

12 As an equivalent to the FB-like-system.

Part 1: Questions regarding SNS usage & behaviour (privacy awareness, friends, used functions).

Part 2: Experimental part with separation into four different experiment groups.

Part 3: Questions about personal basics (age, gender, educational level).

3.1.3 Survey Results

The survey's link was spread via FB and Twitter for 56 days, and it received 640 clicks;13 386

people started the survey and 320 completed it, which equals a response rate of 60.3% regarding

started surveys and 50% regarding completed surveys. Of the 320 completed surveys, 309

attendees were using FB and thus fit into the target group. The following data and analyses were

computed from this target group. The mean age of the interviewee group was 27.7 years,

distributed between 14 and 75, and they had an average of 388.6 FB friends. Regarding gender,

the sample was balanced as 53% of respondents were male. Furthermore, most interviewees

had at least a high-school diploma (83.5%). The interviewees had high privacy awareness on

average, with a mean value of 0.71, where 1 was the possible maximum. Running the Levene

test shows that all variables were evenly distributed across the four different experiment

groups.
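The Levene test checks that a variable's spread does not differ systematically between the four experiment groups. A minimal sketch of the mean-centred Levene W statistic, assuming plain Python lists of per-group values (the thesis does not document its exact test implementation):

```python
from statistics import mean

def levene_w(groups):
    """Levene's W statistic (mean-centred variant) for homogeneity of variance.
    W is compared against an F(k-1, N-k) critical value; W near 0 means the
    groups' spreads are alike. Assumes some within-group spread exists."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    # absolute deviations of each observation from its group mean
    z = [[abs(v - mean(g)) for v in g] for g in groups]
    z_group = [mean(zi) for zi in z]             # per-group mean deviation
    z_all = mean(v for zi in z for v in zi)      # grand mean deviation
    between = sum(len(zi) * (zg - z_all) ** 2 for zi, zg in zip(z, z_group))
    within = sum((v - zg) ** 2 for zi, zg in zip(z, z_group) for v in zi)
    return ((n - k) / (k - 1)) * between / within

# identically distributed groups -> W == 0 (no variance difference)
print(levene_w([[1, 2, 3], [1, 2, 3]]))  # 0.0
```

In practice a statistics package would also supply the p-value from the F distribution; the sketch only produces the test statistic.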

Comparing the FB functions used by the interviewees with the functions in the experimental

part, the results differ by a mean of 0.99. Thus, the interviewees chose approximately one

function less in the experimental part. By distinguishing between the different classes, the

results become clearer (see Table 3). In addition, the participants chose fewer functions even if they

faced an offer that was notably less costly in terms of PD exploitation rights than the actual price

demanded by FB; the function price in group 3 approximately equalled the FB function price.

Experiment Group   Price         Mean Difference   Mean Chosen Functions
Group 1            MND – z            -0.75                2.63
Group 2            MND                -0.96                2.37
Group 3            MND + x            -0.84                2.14
Group 4            MND + x + y        -1.59                1.89

Table 3. Chosen Functions for Different Experimental Groups.
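The per-group figures in Table 3 are simple means over the interviewees of each experiment group. A sketch of the computation, using hypothetical per-interviewee records; the real data set comprises 309 FB-using respondents, so these rows are made up:

```python
from statistics import mean

# Hypothetical records: (experiment group, FB functions used, functions chosen).
records = [
    (1, 4, 3), (1, 3, 2), (2, 3, 2), (2, 4, 3),
    (3, 3, 2), (3, 4, 2), (4, 4, 2), (4, 3, 2),
]

def group_row(records, group):
    """One Table-3 row: mean (chosen - used) difference and mean chosen count."""
    rows = [r for r in records if r[0] == group]
    mean_diff = mean(chosen - used for _, used, chosen in rows)
    mean_chosen = mean(chosen for _, _, chosen in rows)
    return mean_diff, mean_chosen

for g in (1, 2, 3, 4):
    diff, chosen = group_row(records, g)
    print(f"Group {g}: mean difference {diff:+.2f}, mean chosen {chosen:.2f}")
```

A negative mean difference, as in every row of Table 3, means interviewees chose fewer functions in the experiment than they stated using on FB.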

3.1.4 Regression Analysis

Bivariate regressions that explain the stated use of FB functions and the experimental choice

indicate that a higher experiment group and a higher price in terms of PD exploitation rights

influence the interviewees to choose fewer functions. Likewise, privacy awareness influences

the interviewees to choose fewer functions, while being male has an opposite effect.

13 Including accident double-clicks and search engine bot visits.

Implementing significant variables in a multiple linear regression model with the chosen

functions as dependent variable (see Equation 1) provides the following results (see Table 4):

gr.func.all_i = β0 + β1 · exp.gr_i + β2 · pr.awa_i + β3 · gen.male_i + β4 · age_i + β5 · fb.use_i + β6 · fb.friends_i

Equation 1. Multiple Linear Regression for gr.func.all_i.

Coefficients   Estimate   Std. Error   t value   Pr(>|t|)
(Intercept)      4.65       0.94         4.93    1.39e-06  ***
exp.gr          -0.21       0.10        -1.99    0.05      *
pr.awa          -3.49       0.79        -4.41    1.44e-05  ***
gen.male         0.54       0.23         2.34    0.02      *
age             -0.04       0.01        -4.00    8.14e-05  **
fb.use           0.34       0.13         2.76    0.01      **
fb.friends       0.001      0.0004       1.78    0.08      .

Signif. codes: 0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1
Residual standard error: 1.95 on 284 degrees of freedom
(29 observations deleted due to missingness)
Multiple R-squared: 0.21, Adjusted R-squared: 0.19

Table 4. Multiple Regression for Selected Functions in the Experiment.
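Tables 4 and 5 are ordinary least squares estimates (the output is shown in R's summary format). The coefficient column can be reproduced by solving the normal equations (X'X)β = X'y; below is a minimal sketch with a made-up, perfectly linear toy data set rather than the survey data:

```python
def fit_ols(rows, y):
    """Ordinary least squares: solve (X'X) beta = X'y by Gaussian elimination.
    rows: list of predictor vectors; an intercept column is prepended."""
    X = [[1.0] + [float(v) for v in r] for r in rows]
    p = len(X[0])
    a = [[sum(xi[i] * xi[j] for xi in X) for j in range(p)] for i in range(p)]  # X'X
    b = [sum(xi[i] * yi for xi, yi in zip(X, y)) for i in range(p)]             # X'y
    for c in range(p):                      # forward elimination with pivoting
        piv = max(range(c, p), key=lambda r: abs(a[r][c]))
        a[c], a[piv] = a[piv], a[c]
        b[c], b[piv] = b[piv], b[c]
        for r in range(c + 1, p):
            f = a[r][c] / a[c][c]
            for j in range(c, p):
                a[r][j] -= f * a[c][j]
            b[r] -= f * b[c]
    beta = [0.0] * p
    for r in range(p - 1, -1, -1):          # back substitution
        beta[r] = (b[r] - sum(a[r][j] * beta[j] for j in range(r + 1, p))) / a[r][r]
    return beta  # [intercept, slope_1, ..., slope_{p-1}]

# toy data: y = 2 + 3 * x fits exactly, so the estimates recover (2, 3)
beta = fit_ols([[0], [1], [2], [3]], [2, 5, 8, 11])
```

With the thesis's variables, each row would hold (exp.gr, pr.awa, gen.male, age, fb.use, fb.friends) for one interviewee and y the number of chosen functions; the standard errors and p-values in Table 4 require the additional variance computations that R's summary performs.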

As expected, the experimental group affiliation (exp.gr) and the price have a negative impact

on the selected functions, as do privacy awareness (pr.awa) and age. The negative estimate of

privacy awareness is very significant and also highly influential with -3.49, representing that

an interviewee with a privacy awareness of 1 chose 3.49 fewer functions on average compared to

an interviewee with a privacy awareness of 0. Conversely, the estimates for male gender

(gen.male), degree of FB usage (fb.use), and number of FB friends (fb.friends) are positively

influential for the total number of functions chosen during the experiment, while the intercept

is positive too. It is remarkable that the bivariate regression shows that education has no

significant influence on the experiment choices.

The average interviewee chose fewer functions during the experiment if she was in an

experiment group with a higher price, had higher privacy awareness, and was older. Conversely,

the average interviewee chose more functions if she was male, had a high degree of FB usage,

and had more FB friends. Comparing these results with a multiple linear regression to explain the

respective significant variables’ influence on the FB functions that were used (Equation 2)

completes the analysis (see Table 5).

func.all_i = β0 + β1 · pr.awa_i + β2 · age_i + β3 · fb.use_i + β4 · fb.friends_i + β5 · educ_i

Equation 2. Multiple Linear Regression for func.all_i.

Coefficients   Estimate   Std. Error   t value   Pr(>|t|)
(Intercept)      4.00       0.70         5.69    3.24e-08  ***
pr.awa          -1.19       0.60        -1.98    0.05      *
age             -0.02       0.01        -1.95    0.05      .
fb.use           0.33       0.09         3.72    0.0003    *
fb.friends       0.001      0.0003       4.45    1.26e-05  **
educ            -0.32       0.09        -3.79    0.0002    ***

Signif. codes: 0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1
Residual standard error: 1.39 on 275 degrees of freedom
(39 observations deleted due to missingness)
Multiple R-squared: 0.22, Adjusted R-squared: 0.21

Table 5. Multiple Regression for Used FB Functions.

These estimates are distinctly lower than in the first regression, except for the number of FB

friends and the variable for the degree of FB usage, which are about the same. The value for R2

and the explanatory power are also similar. Since the standard error of the intercept is lower

than that of the first regression model, the value for the total of used FB functions fluctuates

less than the value for the total in the selected experimental functions.

3.1.5 Regression Results

H1: SNS users react price sensitively to PD as a PM.

The experiment’s results indicate that by explicitly showing the users the collected and

aggregated PD for each function, the corresponding IA was negated.

This led to changes in users’ choice behaviour: Overall, users chose fewer functions and their

privacy awareness, age, and gender were more influential on their choice. Moreover, users are

price sensitive regarding the functions’ prices in terms of PD and choose fewer functions if the

price rises. Thus, H1 is verified.

H2: Users are unable to show their price sensitivity within current SNSs.

The reduced number of selected functions during the experiment compared to users’ FB choices

can be partly explained by the reduction of IA. Asking users for the approval to give specific

PD exploitation rights in direct exchange for activating a specific SNS function reduces the IA

regarding collected PD that is aggregated and analysed by the SNO and increases “the ability

of an individual to understand the flow of personal information within an SNS” [138].

Furthermore, the results indicate that the influence of education level is negated by the

experimental design, which enhances transparency regarding the demanded exploitation rights

on the users’ side. This insight supports the findings that users have misunderstandings

regarding the data collected by FB [106, 133].

There are other factors that partly explain the observed results. First, users escape the lock-

in effect of FB in the experiment and can make their experimental choice independent of its

influence. Moreover, the experiment gives them the opportunity to make a new choice regarding

their privacy-functionality preference and avoid possible mistakes in choices they made during

their experience with FB. Since the interviewees could not observe the choices of their friends

in contrast to actual FB usage experience, they were not influenced by any kind of herd instinct

or social pressure. Overall, the results strongly indicate that H2 is correct.

The presented experiment and its results have limitations. First, the survey was conducted

online, and the related link was spread personally through FB and Twitter. Thus, the results

might suffer from a peer group problem. Furthermore, “often, survey participants are less

privacy conscious than non-participants. For obvious reasons, this self-selection bias is

particularly problematic for survey studies that focus on privacy” [106]. Thus, it is possible that

the variable privacy awareness and its influence is underestimated in the survey and the

experiment because people with high privacy awareness could have avoided the survey.

Moreover, the regressions do not explain more than about 20% of the variation in the total of

used FB functions and the total of functions selected during the experiment. Thus, there may

be other more influential effects that are not covered within this survey. Lastly, the variable for

privacy awareness was not measured strictly according to the Internet Users’ Information

Privacy Concerns standard (IUIPC) for survey length and usability reasons [139]. With 309

attendees, the survey results are presumably representative of the target group of German FB

users, but cannot be considered as universally valid.

3.1.6 Overall Results

The survey and its included experiment revealed the following results. On average, users

decided to use fewer functions than they use on FB if they face the direct trade-off between

SNS functions and PD exploitation rights, even if the price of the functions is significantly

lower than in FB. Moreover, the total number of chosen functions decreases when the price in

terms of PD exploiting rights increases. Second, the variables for privacy awareness and gender

are more influential in the experimental setting than they are for users’ actual choices on FB.

Third, the fluctuation of the total number of functions selected during the experiment was higher

than the fluctuation in the total number of functions used on FB.

The first hypothesis H1 that SNS users show price sensitivity regarding the trade-off between

PD exploitation rights and SNS functions is verified. Users value privacy but the results also

indicate that users cannot transfer the observed price sensitivity to FB due to various obstacles:

IA about the collected and exploited PD by FB, peer pressure to disclose personal information,

and network- and lock-in effects [68, 130, 131, 133].

The survey findings indicate that the second hypothesis, H2, is also assumed to be true. Users

cannot show their price sensitivity regarding PD as a PM on FB. Moreover, the negative

influence of the education level variable on the use of FB functions as well as its insignificance

for the experiment decision,14 support research outcomes suggesting that the real price of SNS

usage is veiled to the users [72, 133, 134].

The experiment results also show that the user position is strengthened by enhancing

transparency. Diminishing IA by clearly stating which PD is collected, aggregated, and analysed

for each SNS function helps users to overcome misunderstandings and state their privacy

preferences. As the IA is beneficial for SNOs and their revenue model of exploiting PD for

commercial purposes, it will not benefit SNOs when users optimize their privacy-functionality

trade-off and choose more privacy by revealing less PD.

14 The respective regressions are presented in the appendix.

3.2 Modelling the Principal-Agent Dilemma in SNS

From an economic perspective, this subchapter analyses how the previously identified IA can

be reduced by technological and regulatory instruments when focusing on the “First Tier

Relationship Space” of markets for PD [66]. The focus is on the direct relation between the primary

data-controller and data-subject, which in the case of SNSs are the SNO and the user. This builds on DCBs

as defined by Müller et al. and the privacy problems that are inherent in this business model

[37, 42]. A detailed description of the model including the transformation of SNSs is provided

in subchapter 1.1.2.

As discussed above, SNOs provide SNSs to consumers free of charge and generate revenue by

providing third-party companies with the ability to present targeted advertisements to these

consumers. This platform structure characterises SNSs as MSPs that cater to users of their

services and to advertisers [43]. The collection of PD and the generation of user profiles are at

the core of those DCB models, as these profiles build the foundation for delivering targeted

advertisements (cf. subchapter 1.1). It is in the interest of SNOs to collect as much PD as

possible to generate precise targeting profiles. In addition, SNS users can benefit from profiling

(e.g. through decreased transaction costs due to automatically personalized recommender

systems) [140, 141]. The extensive collection, analysis, and usage of PD affect and threaten

user privacy [37, 142]. In the context of SNSs and relating to RQ2.b, this chapter addresses the

following questions:

1. How can the relationship between users and providers be modelled in economic terms?

2. Which leverage points for balancing economic efficiency and privacy can be identified

and how can technological and regulatory instruments help to establish this balance?

To describe the privacy problems in SNSs and identify leverage points for balancing efficiency

and privacy, this subchapter provides an economic model of the privacy trade-offs in SNSs. The

presented model builds on a novel principal-agent perspective of SNS that is rooted in a

commodity-centric notion of privacy. A critical analysis of the model will then show how

regulatory and technological instruments could balance profit seeking of PD market participants

and data-subjects’ rights to informational self-determination and privacy.

3.2.1 Related Work

The emergence of barely regulated markets for PD and their threats to privacy have not gone

unnoticed by academia. Scholars in the computer sciences, jurisprudence, and IS are

investigating legal and technological instruments to regulate such markets and provide

instruments for data subjects to exercise greater control over their PD. Notable approaches

towards organizing markets for PD have been proposed by Laudon [143], Schwartz [144], and

Novotny and Spiekermann [66].

Technological and legal instruments for addressing the privacy problems in PD markets were

recently discussed by Spiekermann and Novotny [145]. The following pages elaborate on their

commodity-centric notion of privacy, in considering only usage rights to PD as tradable.

Acquisti provides an economic model of privacy trade-offs in electronic commerce by

focussing on data subjects’ decision process and arguing for models based on psychological

distortions [146]. However, he does not investigate the perspective of the data controller and

the structure of the market. Furthermore, Chellappa and Shivendu provide a model for game-

theoretic analysis of property rights approaches to privacy, considering only monopolistic

markets [79]. In contrast to the existing work, the model provided in the following section uses

a principal-agent perspective and focuses on the market structure in SNSs.

3.2.2 Model Assumptions

The identification of leverage points for balancing market efficiency for PD and data-subjects’

privacy in SNS requires a model to describe the market, its agents’ behaviour, and the market’s

power structure. First, SNS and the assumptions underlying the model are elaborated on, and

then the principal-agent model of SNS is provided, building on a commodity-centric notion of

privacy. The model is based on the following three assumptions:

1. Usage rights to PD are transferable and tradable.

2. Providers and users act rationally and have homogeneous utility functions within their

constraints.

3. Users are privacy pragmatists who are willing to substitute privacy for functionality to a

certain degree.

Assumption 1: Among many others, Campbell and Hanson as well as Davies argued that the

growing economic importance of PD is paralleled by a shift in the public perception of PD and

a reconceptualization of privacy. The latter moves privacy from the domain of civil rights to the

domain of commodification, such that PD is increasingly considered a tradable commodity that

is separate from the individual [147, 148]. In the wake of this shift in perception, property rights

for PD have increasingly been debated not only in law, but also in IS and computer sciences

(cf. [52, 144, 149–152]). Purtova has shown that the current European data protection regime

“endorses the ‘property thinking’ with regard to personal data” and that data-subject ownership

claims to PD are compatible with the principle of informational self-determination [52].

However, the human rights aspect of privacy excludes full objectification of PD, as ownership

claims to it cannot be alienated [144, 149]. This model does not consider a full transformation

of PD into property. Instead, it follows Spiekermann and Novotny and only considers usage

rights to PD as transferable and tradable [145].

Assumption 2: For mathematical simplicity, it is assumed that all agents within the model,

including providers and users, act rationally under constraints. Thus, an agent will perform any

action that increases her expected utility and will avoid any action that has no positive expected

utility for her. Moreover, it is assumed that users and SNOs have homogeneous utility functions,

which allows for the utilization of a single expected utility function for all users and a single

utility function for all providers within the model. These assumptions are also common and

necessary for the standard principal-agent model [153].

Assumption 3: Similar to Westin and Ackerman et al. “pragmatic users” are the centre of this

investigation [154, 155]. Pragmatic users are concerned about their privacy (i.e. the usage of

data regarding them) but weigh their concerns against the benefits of disclosing data about

themselves. This user model is supported by current research which shows that users are willing

to engage in online transactions and disclose PD if the perceived benefits of doing so outweigh

the cost, including the perceived cost of reduced privacy [106, 156, 157].
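The cost-benefit weighing of such a pragmatic user can be sketched as an expected-utility comparison. The concave utility below makes the user risk averse; all numbers (initial endowment, disclosure benefit, leak probability and loss) are purely illustrative, not taken from the thesis:

```python
import math

def utility(w):
    # concave von Neumann-Morgenstern utility => risk aversion
    return math.log(w)

def discloses(w0, benefit, leak_prob, leak_loss):
    """Pragmatic user: disclose PD iff the expected utility of disclosing
    (benefit received, but a data leak may occur) beats the status quo."""
    eu_disclose = ((1 - leak_prob) * utility(w0 + benefit)
                   + leak_prob * utility(w0 + benefit - leak_loss))
    eu_keep = utility(w0)
    return eu_disclose > eu_keep

# the same benefit is accepted under a small leak risk and refused under a large one
print(discloses(10, 2, 0.1, 8))  # True
print(discloses(10, 2, 0.9, 8))  # False
```

The switch from disclosure to non-disclosure as the perceived leak probability rises is exactly the weighing of benefits against perceived privacy cost that assumption 3 describes.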

As defined above, users of SNSs may receive benefits from SNOs’ data aggregation and

analysis. However, to reap these benefits from data processing, users must entrust data that

relates to them to an SNO (i.e., transfer data usage rights to the provider). Given the commodity-

centric perspective on privacy, the relationship between users and providers is considered a

principal-agent relationship [158]. In this relationship, users suffer severe IAs as shown

beforehand (cf. chapter 3.1 and [37]). Thus, users face a problem of moral hazard (MH); they

face the risk that providers exercise transferred usage rights in ways users do not prefer [153].

The classic approach to describing and investigating solutions to MH problems in principal-

agent relations is provided by the contract theory [153]. The classic model is based on the idea

that the principal and agent negotiate a contract and the principal pays the agent a price that will

urge the agent to follow strategies that benefit the principal, rather than following strategies that

solely maximize the agent’s own benefit. As such, the risk of MH is minimized by the principal,

though the classic contract theory model cannot be applied directly in the context of SNSs. As

shown above, current SNS users (the principal) undoubtedly enter a contract with the SNO (the

agent) by agreeing to its terms of usage without any latitude to set their own terms to this

agreement. Therefore, the user cannot negotiate this contract and must accept the conditions set

by the SNO.

The user also transfers usage rights to PD to the SNO such that she can reap benefits from the

SNO’s usage of the data. However, based on the contract, the user transfers more data and wide-

reaching usage rights than necessary (and possibly desired by the user) for receiving the desired

benefits, and pays a price to the SNO (cf. chapter 3.1). The SNO sets this price which does not

urge the SNO to act in the user’s interest. In fact, regarding privacy, the price is set to maximize

benefits for the SNO. Although the user is seldom able to fully comprehend common privacy

policies or terms of usage [159, 160], the model assumes that the user expects the SNO to be

able and eager to collect more PD than technologically necessary and to use the data for defined

purposes (e.g. for advertising) beyond providing the desired service (see assumption 2). Current

research supports this assumption and has shown that users engage in privacy-seeking

behaviour when on an SNS [72].

3.2.3 Designing the Principal-Agent Model

Game theory addresses problems where the probability of a certain outcome is unknown. In

this approach, users face a decision under risk but not under complete uncertainty. While the

probabilities cannot be determined precisely, general outcome probabilities are roughly

deducible. For example, users can infer the probabilities for some extreme outcomes based on

media reports about data leakage scandals or similar reported events. Thus, following the classic

model, the user’s expected utility is presented as a concave von Neumann-Morgenstern utility function

[153]. This allows the model also to consider different user types, such as risk-averse privacy

pragmatic users. The user’s expected utility is formulated as follows:


𝐸𝑈 (𝑎, 𝑠, 𝑥, 𝑟, 𝑧) ≔ 𝜋(𝑠, 𝑎) − (𝑔(𝑥, 𝑟, 𝑧) + 𝑧)

Equation 3

The user desires the SNO to perform action â (i.e., provide the desired services and exercise the

transferred usage rights solely to provide these services). Here, â is an element of the finite set of

possible provider actions A. Depending on the random variable s and the action a chosen by the

provider, the user receives the outcome π(s, a), with s ∈ S, S = {s1, . . . , sn}, and probability p(s) ∈ [0, 1],

where s is a random variable that is individually drawn for each user. The outcome function π(s, a)

accounts for the SNO’s ability to take action a (e.g., exercise usage rights for purposes

undesired by the user): because the outcome π(s, a) partly depends on chance, the user

cannot compare it to the desired outcome π(s, â).

The user cannot determine which data and which usage rights are necessary to provide the

desired services (see MND). For example, an average user cannot estimate which data and

which usage of that data is necessary for providing the characteristic features of an SNS (as

described in chapter 1.1.2). The cost of providing the SNS with the technologically necessary

minimum amount of data and transferring the respective usage rights is represented by xmin >

0. The user’s maximum desired cost of disclosing data and transferring usage rights is

represented by x̂. As described above, the user suspects that the SNO will collect more data

than technologically necessary to provide the desired SNS and use the transferred usage rights

for purposes (specified in the terms of usage) beyond providing the desired services. For

simplicity, illegal data usage by the SNO is not considered in the model. The user’s expected

cost of data disclosure and transfer of usage rights are represented by x with x ≥ xmin > 0.

User PSB is accounted for by representing the subjectively expected privacy-related overall cost

of using the SNS as (g(x, r, z) + z) with g(x, r, z) ≥ xmin and z ∈ [0, 1]. For simplicity and clarity, the

costs of using the service are neglected (e.g. expenditure of time). In the construct (g(x, r, z) +

z), the variable z ϵ [0, 1] represents the cost that a user incurs when engaging in PSB, trying to

reduce x using privacy-enhancing technologies (PET) or by adjusting privacy settings [161]. If

the user does not engage in PSB then z = 0, and if the user engages in PSB to the maximum

extent technologically available then z = 1. It is assumed that the user is aware that PSB is not

necessarily successful (i.e. does not necessarily decrease x; e.g., data might be inferred

anyway). This uncertainty is represented by the random variable r, which is individually drawn

for each user, with r ∈ R, R = {r1, . . . , rm}, r ∈ [0, 1], and the probability q(r) ∈ [0, 1]. In this


construct r represents the chance of success of a user’s PSB, with r = 0 meaning no success at

all (i.e., g(x, r, z) = x). If r = 1, success depends on the invested z (i.e., g(x, r, z) = x̂ (or g(x, r,

z) = xmin if x̂ < xmin) provided z = 1). Furthermore, if r = 1 and z ∈ ]0, 1[, then g(x, r, z) ∈ ]xmin, x[.
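The boundary behaviour of g(·) and the user’s expected utility (Equation 3) can be sketched numerically. The linear interpolation between the stated boundary cases, as well as all function and variable names, are illustrative assumptions, not part of the model:

```python
def g(x, r, z, x_hat, x_min):
    """Expected privacy cost after privacy-seeking behaviour (PSB).

    Boundary cases taken from the model:
      r = 0            -> g = x (PSB has no effect)
      r = 1 and z = 1  -> g = x_hat (or x_min if x_hat < x_min)
      r = 1, 0 < z < 1 -> g strictly between x_min and x
    The linear interpolation between these cases is an assumption
    made for illustration only.
    """
    target = max(x_hat, x_min)       # best cost level PSB can achieve
    return x - r * z * (x - target)

def expected_user_utility(pi, x, r, z, x_hat, x_min):
    """Equation 3: EU = pi(s, a) - (g(x, r, z) + z)."""
    return pi - (g(x, r, z, x_hat, x_min) + z)

# Without PSB (z = 0) the user bears the full expected cost x.
print(g(x=1.0, r=1.0, z=0.0, x_hat=0.5, x_min=0.25))   # 1.0
# Fully effective PSB (r = 1, z = 1) reduces the cost to x_hat.
print(g(x=1.0, r=1.0, z=1.0, x_hat=0.5, x_min=0.25))   # 0.5
```

Note that z enters the utility twice: once as effort inside g(·) and once as a direct cost, so even fully effective PSB is not free for the user.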

The SNO’s expected utility from providing a user with SNS is formulated as follows:

𝐸𝐹(𝑎, 𝑥, 𝑟, 𝑧) ≔ 𝑓(ℎ(𝑥, 𝑟, 𝑧)) − 𝑐(𝑎)

Equation 4

The provider aims to receive x (i.e., data relating to the users and the respective usage rights)

and incurs the cost c(a) of its action (providing the SNS and exercising usage rights to user’s

data), with c(a) ≥ 0 and c(a) = 0 only for a = 0. With the utility function f(h(x, r, z)), the SNO’s

expectations depend on the expected effectiveness of users’ PSB, i.e., on r, z, and x, which is

accounted for by the outcome function h(x, r, z). In any case, the provider receives at least h(x,

r, z) ≥ xmin. Given the high information asymmetries in SNSs (chapter 3.1 or [37]), the SNO is

in a position to set the price in terms of data transfer and usage rights (i.e., to set x as high as

possible for profit maximization). Thus, the SNO aims to establish contract x(π), which

maximizes:

∑_{i=1}^{n} p(s_i) ∗ ∑_{j=1}^{m} q(r_j) ∗ EF(a, g(x(π(s_i, a)), r_j, z), r_j, z)

Equation 5
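Numerically, Equation 5 is a double expectation over the draws of s and r. The sketch below assumes the inner EF terms have already been evaluated; all probabilities and payoff values are hypothetical placeholders:

```python
def sno_objective(p, q, ef):
    """Equation 5 as a double expectation:
    sum_i p(s_i) * sum_j q(r_j) * EF(...),
    where ef[i][j] holds EF already evaluated for the pair (s_i, r_j)."""
    return sum(p_i * sum(q_j * v for q_j, v in zip(q, row))
               for p_i, row in zip(p, ef))

p = [0.5, 0.5]           # probabilities p(s_1), p(s_2)
q = [0.75, 0.25]         # probabilities q(r_1), q(r_2)
ef = [[4.0, 2.0],        # EF for (s_1, r_1), (s_1, r_2)
      [2.0, 0.0]]        # EF for (s_2, r_1), (s_2, r_2)
print(sno_objective(p, q, ef))   # 0.5*3.5 + 0.5*1.5 = 2.5
```

The SNO’s optimization then consists of choosing the contract x(π) that maximizes this expectation subject to the user constraints introduced next.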

While the assumed pragmatic user is willing to trade privacy for the benefits of using an SNS,

even a pragmatic user is not willing to completely substitute privacy with functionality (cf.

subchapter 3.1 and [154, 155]). Thus, the assumed pragmatic user will refrain from using a

specific SNS if she expects the SNO to collect data and exercise data usage rights to an extent

far beyond that which is desired. In that case, EU < U0 where U0 is the user’s expected utility

from not using a service at all. Similar to the classic contract theory, this constraint is

represented as follows:


∑_{i=1}^{n} p(s_i) ∗ ∑_{j=1}^{m} q(r_j) ∗ EU(a, s_i, x, r_j, z) ≥ U0

Equation 6

As non-monopolistic markets are assumed, users can choose from several SNOs (e.g. different

SNSs) to obtain similar benefits. Thus, a privacy pragmatic user will only use a specific service

if, in addition to the constraint formulated above, the following constraint holds:

∑_{i=1}^{n} p(s_i) ∗ ∑_{j=1}^{m} q(r_j) ∗ EU(a, s_i, x, r_j, z) ≥ ∑_{i=1}^{n} p(s_i) ∗ ∑_{j=1}^{m} q(r_j) ∗ EU(a′, s_i, x′, r_j, z).

Equation 7

Here a’ and x’ represent actions of an SNO’s competitors and the data and data usage rights to

be transferred for using their services. If competitors exist, the provider cannot completely

neglect the assumed pragmatic user’s concerns for privacy. Again, there are some constraints

to that equation. For simplicity, the model does not include positive direct network effects

between users as described in chapter 1.1.2.
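Taken together, Equations 6 and 7 determine whether a pragmatic user adopts a specific SNS. A minimal sketch over already-computed expected utilities (function name and all values are hypothetical):

```python
def user_adopts(eu_sns, u0, eu_competitors):
    """A pragmatic user adopts a specific SNS only if its expected
    utility beats the outside option U0 (Equation 6) and every
    competing offer (Equation 7). Inputs are already-computed
    expectations; all values used below are hypothetical."""
    best_rival = max(eu_competitors, default=float("-inf"))
    return eu_sns >= u0 and eu_sns >= best_rival

print(user_adopts(1.2, u0=0.0, eu_competitors=[0.9, 1.0]))  # True
print(user_adopts(0.8, u0=0.0, eu_competitors=[0.9, 1.0]))  # False: a rival offers more
print(user_adopts(-0.1, u0=0.0, eu_competitors=[]))         # False: worse than not joining
```

With an empty competitor list the check collapses to the pure participation constraint of Equation 6, which matches the monopoly case discussed below.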

3.2.4 Balancing User Privacy and SNO Profit-Seeking

The presented model exhibits some limitations. In line with most economic models, it builds

on the assumption of rational agents. While the model can be adapted to model privacy-affine

or privacy-uninterested users by adapting U0 or g(•), the model does not consider psychological

distortions [146]. Furthermore, the model covers the MH problem on the user’s side, but it does

not consider the inverse IA suffered by the SNO with respect to the user’s characteristics. While

it does account for the IA regarding users’ PSB, it does not account for possible misuse of

services by the user (e.g., data crawling and reselling). The model also does not consider illegal

behaviour by the SNO (e.g., privacy policy violations or non-compliance with data protection

regulation). Moreover, other limitations that might restrict an SNO’s ability to collect and use

PD are not included (i.e. usability, design, and technical constraints or legal restrictions).

Finally, the positive direct network effects between users within the same SNS are rudimentarily

covered by the outcome (s, a), but not explicitly included. Considering these limitations, the

opportunities to achieve a balance between user privacy and SNO profit-seeking by means of

technological and regulatory instruments are discussed in the following.


In a state of perfect competition with SNOs as price-takers [153], the SNOs would be forced

by competition to set x as low as possible to cover their marginal cost of the provided action a.

If the MND and usage rights xmin to perform action a suffice to cover the marginal costs of

providing it to yet another user, the x demanded by all SNOs would consequently be xmin and

an equilibrium would exist for x = xmin with a ≥ a’.

However, the current SNS market is far from a state of perfect competition. The actual market

situation indicates that the branch is dominated by FB with roughly 85% of the market share,

and that FB is flanked by small competitors competing for the left-over market [162]. The

current market situation resembles a monopolistic situation where the SNO maximizes its

revenue and therefore x, with the only constraint of delivering a service with expected user

utility equal to or greater than the user’s utility without any service: EU ≥ U0 (see Equation 6).

Thus, the dominant SNO is currently in the position to establish contracts with its users, though

not necessarily in all cases, even where x > x̂ > xmin [72]. If EU ≥ U0 holds with x > x̂, there are few

incentives for the SNO to establish contracts that are more privacy-friendly (i.e., set x = x̂ and

a = â). Three scenarios can be distinguished for the first relationship tier to be investigated [66]:

1. Users do not consider privacy a competitive factor.

2. Users perceive privacy as a competitive factor, but they cannot determine SNOs’ level of

privacy-friendliness.

3. SNS markets are currently monopolistic.

a. Users perceive privacy as worthy of protection.

b. Users do not perceive privacy as worthy of protection.

These scenarios require different approaches towards balancing market efficiency and privacy.

Furthermore, whether there is a need for privacy protection at all must be investigated. Some

scholars have argued that privacy protection generally decreases efficiency and general welfare

[63]. While full transparency and full information might increase efficiency, the human rights

aspect of privacy excludes pure efficiency-focused approaches towards markets for PD [15].

Regardless of which scenario currently exists, a balance between efficiency and privacy in SNS

businesses must be established.


Which instruments are suited for achieving this balance depends on the SNS market structure.

To identify and discuss instruments for balancing privacy and efficiency the scenarios provided

above are analysed in the following section. There are three high-level approaches to balancing

privacy and efficiency: market-centric approaches, regulation-centric approaches, and user-

centric approaches. The differentiation criteria for these approaches are their primary

instrument for balancing privacy and SNO’s profit-seeking in SNS. Table 6 provides an

overview of the approaches and their applicability to the scenarios described under the premise

that privacy requires protection.

Table 6: Applicability of high-level approaches in different scenarios

Scenario 1: Users do not consider privacy as a competitive factor

In scenario 1 (S1), pure market-centric approaches (i.e., regulatory laissez-faire or incentive-

centred interventions) are not suited to fostering increased privacy-friendliness in SNS markets

as both SNOs and users have no self-motivated incentives to provide or demand respective

services. The same is true for user-centric approaches. For privacy to be achieved in S1, only

regulatory action and a (soft-) paternalistic regulatory regime can be applied. The presented

model does not aim to provide insight into the challenges of such an approach and thus, S1 is

not considered further in this analysis.

Scenario 3: The SNS market is currently monopolistic.

In scenario 3 (S3), market-centric approaches are not suited to balancing efficiency and privacy.

In S3.b, user-centric approaches are not well suited, as users have no incentive to take action to

protect their privacy. In S3.a, users, such as the pragmatic user of the presented model, have

incentives to expend z > 0 to protect their privacy. Provided PSB is effective and users can

determine its effectiveness, user-centric approaches can lead to increased privacy in S3.a, but

only on an individual level when PET is used on the user side [161].

Market-Centric: privacy through market mechanisms, primarily driven by incentive-based
approaches, possibly supported by technological instruments and/or regulatory instruments.

Regulation-Centric: privacy through regulatory instruments, focussing primarily on prohibition,
possibly supported by technological instruments and/or market mechanisms.

User-Centric: privacy through technological instruments, primarily driven by user demand,
possibly supported by market mechanisms and/or regulatory instruments.

             Market-Centric   Regulation-Centric   User-Centric
Scenario 1         -                  X                  -
Scenario 2         X                  X                 (X)
Scenario 3         -                  X                 (X)


Purely user-centric approaches would not change the market structure and an approximate

monopoly situation would continue. Therefore, regulatory action to weaken or break the

monopoly and enable and increase market competition would be necessary. To achieve a

balance between profit-seeking and privacy, only regulation-centric approaches are applicable

in S3. Such approaches would need to convert S3 to S1 or S2 and subsequently perform

regulatory action, as described above, to achieve a balance between privacy and efficiency.

Chellappa and Shivendu propose the introduction of property rights to personal information as

a regulatory approach to balancing profit seeking and privacy in a monopoly [79].

Scenario 2: Users perceive privacy as a competitive factor, but they cannot determine

SNOs’ level of privacy-friendliness.

Scenario 2 (S2) exhibits characteristics that are similar to those of ‘lemon markets’, which are

doomed to fail in the long term [163]. While users in S2 value privacy, they cannot determine

the privacy-friendliness of an SNO ex ante or ex post entering a contract. Thus, SNOs have no

incentive to compete on privacy and compete on functionality instead. From a long-term view,

privacy-friendly SNOs would leave the market due to their lower profits caused by a lower x,

and the market would fail as no balance between privacy and profit-seeking could be achieved.

In classic lemon markets and in SNS markets, the problem is rooted in information and power

asymmetries [37]. In S2, classic instruments for reducing these asymmetries seem best suited

to achieving a balance between privacy and market efficiency. Because of the principal-agent

relationship in SNSs and the human-rights aspect of privacy, further instruments and the

suitability of classic instruments for reducing information and power asymmetries must be

investigated for the context at hand. In the following paragraphs, technological and regulatory

instruments for balancing privacy and efficiency in S2 are discussed. Figure 9 provides an

overview of the analysed instrument categories.

Signalling and screening are instruments for reducing information asymmetries ex ante

establishment of a contract [83]. The informed party can utilize signalling instruments to signal

its characteristics to the uninformed party to reduce the IA and convince the uninformed party

to establish a contract with the signalling party instead of with another party. Signalling,

however, can only be a successful mechanism if the uninformed party has good reason to trust

in the signal (i.e., the cost of falsely signalling a characteristic while not exhibiting it must be

high, and ideally exceed the benefits of doing so). Screening can be perceived as inverse

signalling, as the uninformed party can utilize screening instruments to reduce IAs by actively


trying to identify the informed party’s characteristics. In the context of SNS markets, an SNO

has superior information regarding xmin, x, and a (i.e., it is the informed party) [37]. Signalling

and screening are instruments for market-centric approaches to balancing privacy and

efficiency (see Table 6).

Drawing from the literature, transparency-enhancing technologies (TETs) that are applied

before establishing a contract (ex ante TET) were identified as potential instruments for

signalling in the SNS market [164, 165]. Ex ante TETs comprise all TETs that are applied before

using a service and include tools for policy visualization (e.g., PrivacyBird)15, privacy seals

(e.g., the European Privacy Seal),16 and other instruments for providing information regarding

intended data collection and usage (i.e., information on x, a, and possibly xmin). A variety of ex

ante TETs exist; however, their suitability for balancing privacy and efficiency in SNSs is

limited.

15 http://www.privacybird.org (accessed 07.12.2017).

16 https://www.european-privacy-seal.eu (accessed 07.12.2017).

Figure 9: Instruments for Addressing the Privacy Problem in SNS.

While tools for policy visualization can signal intended PD collection and usage (i.e.

x), they do not provide users with information regarding the actions that an SNO performs (i.e.

a). Privacy seals can be a valid instrument for signalling if they are issued by a trustworthy

party and the criteria for awarding the seals are known to users.

Screening originally refers to actions of the uninformed party that aim to induce the informed

party to actively reveal its characteristics during negotiation of the contract [83]. Technological

instruments for policy negotiation exist (e.g. P3P/APPEL, XACML, and the approaches

provided by Pretschner et al., and Hanson et al. or Bohrer et al.) [166–169]. However, these

mechanisms are not supported by a relevant SNS, which is not surprising given the power

relations in the market described above. Reputation services (RSs) are existing and actively

used mechanisms that resemble classic screening for SNSs, in that they allow the uninformed

party to reduce IAs ex ante establishment of a contract [170]. These mechanisms include crowd-

sourced RSs such as “Web of Trust”17 or “TOS;DR”18 or RSs aimed at allowing users to rate

other services. RSs can allow users to gain insight into an SNO’s behaviour. However, crowd-

sourced RSs are not well suited to provide meaningful information regarding xmin, x, or a as

other users cannot fully determine the SNOs’ actions (even if they have already established a

contract with a specific SNO). If the SNO grants wide-ranging insight into xmin, x, or a after

establishing a contract, crowd-sourced RSs can constitute effective instruments for estimating

xmin, x, or a. However, no major SNO currently does this. In addition to instruments that are

applied ex ante establishment of a contract, instruments that can be applied to reduce

information and power asymmetries ex post must be investigated. Furthermore, some

instruments, especially regulatory ones, exist that cannot be categorized as ex ante or ex post

instruments.

Users can apply user-side PETs [161] at cost z to reduce the information they disclose and an

SNO’s power to exercise usage rights to PD, including tools for anonymity (e.g., Tor)19 and

obfuscation (e.g. TrackMeNot [171]). Since data minimization on the user side does not help

users learn the hidden characteristics of the SNO, user-side PETs are considered instruments

for diminishing power asymmetries. While user-side PETs do not allow users to estimate xmin,

x, or a, they can reduce the privacy-related costs of using SNSs ex post establishment of a

contract by reducing h(x, r, z).

17 https://www.mywot.com (accessed 07.12.2017).

18 https://www.tosdr.org (accessed 07.12.2017).

19 https://www.torproject.org (accessed 07.12.2017).

Usage control tools (as presented in e.g. [172, 173]) in combination with

policy negotiation tools can also be applied to reduce power asymmetries (and IAs if policy

negotiation can be used for screening) by giving users means for setting rules for the exercise

of usage rights by the SNO (i.e. for influencing a). Ex post TETs aim to provide users with

insight into actual data collection and usage [164]. The most prominent class of ex post TETs

are privacy dashboards. Depending on their functionality, they can be considered instruments

for reducing IAs or instruments for reducing power asymmetries. While read-only ex post TETs

are instruments for reducing IA ex post establishment of the contract, interactive ex post TETs

are instruments for reducing power and information asymmetries [174]. Whereas ex post TETs,

and privacy dashboards in particular, seem to be promising approaches to balancing privacy

and efficiency in SNSs, current approaches (as proposed in e.g. [175, 176]), such as the privacy

dashboard provided by Google,20 do not provide trustworthy information and are not well suited

to balancing privacy and efficiency [174].

Accountability-centric approaches are currently widely discussed as methods for balancing

privacy and efficiency. Privacy by accountability inherently requires a combination of

technological and regulatory instruments [177, 178]. Respective approaches to privacy build

upon audits to determine SNOs’ adherence to data protection regulation and agreed-upon

polices. A central concept within accountability-centric approaches towards privacy is liability,

(i.e., sanctioning providers in the case of noncompliance with regulation and agreed-upon

policies). While accountability-centric approaches to balancing privacy and efficiency are

promising and increasingly investigated, no such solution exists yet.

Regulatory action towards reducing IA is currently being taken with the new GDPR [34].

Regulatory instruments can set the legal framework to reduce IA ex ante and ex post

establishment of contracts and can support both regulation-centric and market-centric

approaches towards balancing privacy and efficiency. Another pure regulatory approach is the

assignment of property rights to PD, which is debated in the fields of law, IS, and economics.

20 https://myaccount.google.com (accessed 07.12.2017).


3.3 Conclusion

The survey and the included experiment presented in chapter 3.1 proved that PD can be

interpreted as a PM in SNS, that users are price sensitive and pay for SNS functionalities with

PD, and that an IA regarding the collection and use of PD in current major SNSs exists at the

users’ expense. In addition to the IA about the aggregated and analysed data, users are exposed

to other effects which urge them to reveal more PD than self-intended (i.e. peer pressure to

disclose, network effects, and lock-in effects) [68, 130, 131, 133]. Furthermore, the experiment

outcomes support former study results, which suggest that the real price of SNS usage in terms

of PD disclosure, aggregation, and analysis is veiled from users by different mechanisms

[72, 133, 134].

Building on these insights and drawing from contract theory, a principal-agent model of the

privacy problem and trade-off in SNS was provided in chapter 3.2. Asymmetries of information

and power were identified as the primary leverage points for balancing efficiency and privacy

in the SNS market, and in non-monopolistic markets, privacy is perceived as a competitive

factor by users and SNOs. Thus, SNOs have the incentive to provide users with increased

transparency and control regarding their PD in perfect competition, though the actual SNS

market situation is almost monopolistic [162]. The presented model reveals that regulatory

pressure might be necessary to foster competition in the current SNS market and enhance the

competitive role of privacy. Based on the analysis, it was concluded that a transparency-

fostering regulatory regime in combination with trustworthy ex ante and ex post TET,

respectively accountability mechanisms, seems best suited for achieving a more privacy-

friendly balance of efficiency and privacy in the current SNS market. Adopting a commodity-

centric notion of privacy into law might further increase users’ ability to exercise their right to

informational self-determination without losing the benefits of SNS.


4 SNS Privacy and Regulation

In classic economic theory, governmental regulation is necessary if markets are imperfect

and cannot find an efficient equilibrium [83]. The chapters above demonstrate that SNS users

suffer from information and power asymmetries and cannot enforce their privacy preferences

(cf. chapter 3). Furthermore, competition in the SNS market is detrimental to user privacy rather

than enforcing it, and the markets are moving towards a monopoly situation (cf. chapter 2).

Accordingly, a governmental intervention or regulation of the SNS market is appropriate.21

The objective of this chapter is to answer RQ3 about whether the user privacy problem in SNSs

can be solved with regulative approaches. Therefore, this chapter is divided into two major

parts. The first subchapter 4.1 compiles a qualitative, conceptual framework of user privacy in

SNSs to validate regulative approaches concerning their privacy coverage and sustainability.

For the design process of the framework, different user privacy theories as

well as privacy threats and types of PD were identified from the recent, scientific literature. The

presented framework combines state-of-the-art privacy theories structured in three dimensions,

including time as a unique approach. It covers all identified privacy threats and types of PD.

The framework can be used to evaluate privacy policies and legislations, and as a basis for

privacy impact assessments (PIA) and privacy by design (PBD) approaches for SNSs and

DCBs.

The following subchapter 4.2 draws on the framework developed previously to evaluate the

current leading privacy legislation, the EU GDPR, and to answer RQ3. In accordance with the

framework dimensions, the GDPR is analysed for its privacy-relevant segments, which are then

examined regarding their implementations for user privacy in SNSs, and for their juridical and

economic sustainability and technical feasibility. Moreover, the effects of GDPR on the

dynamics of the SNS market regarding user privacy are assessed. A conclusion for the entire

chapter is then provided.

21 This chapter includes and extends the paper [32].


4.1 Economic Requirements for SNS Privacy Regulation

The topic of privacy in SNSs has been addressed by numerous researchers from different fields

over the past two decades. Different theories, frameworks and typologies for privacy have been

published from various professions to explain this complex concept for diverse scopes

including SNSs (cf. subchapter 1.2). Moreover, privacy concerns have led to different data

protection legislations worldwide with a wide range of strictness. This subchapter approaches

the privacy field from a political economics perspective. The aim is to develop a conceptual

framework for online privacy in SNSs based on existing theories and supplemented by two

different time points as a unique dimension. This framework enables the evaluation of the

purview and efficacy of policies and regulations as well as the SNOs’ consequent measures

concerning user privacy.

First, an overview of privacy characterisations, frameworks, and theories from recent literature

is given. The results from this research are then used to identify the main privacy threats from

the user perspective and the different types of PD in SNSs. The multidimensional, conceptual

privacy framework is developed from different existing privacy typecasts and theories enriched

with a new time dimension. Then it is shown that the previously identified privacy threats and SNS data

types are covered by the privacy framework. In the last step, the results are evaluated and

summarised.

According to Maxwell, a conceptual framework is based on a corresponding literature review

[96]. Consequently, the history of privacy categorizations and frameworks was determined by

a short literature review based on the concepts of Rowley [97]. Building on this, the

multidimensional privacy framework was developed and explained, which represents a

simplified mapping of privacy and its dimensions in SNSs. Therefore, the notion of Becker

[179] was followed and the framework was systematically developed from prior perceptions,

using the contributions as modules for the work. The existing theories and typologies are

condensed for online privacy in this conceptual framework. The resulting framework contains

different, hierarchical classifications which can be addressed and fulfilled by corresponding

regulations, though such a framework can never be considered complete or ultimate. The

following history of online privacy shows that technological progress and new scientific results

will always generate the need to expand or modify existing privacy concepts, typologies, and

frameworks.


4.1.1 Related Literature

The literature presented in the following section was identified by searching Google Scholar,

the IEEE Xplore database, and the ACM digital libraries for the words “privacy” and SNS or

OSN in combination with the terms “framework”, “typology” or “types” as well as “threats” or

“challenges”. Based on the results, a citation pearl growing search was conducted with the

appropriate documents according to the literature review concepts of Rowley [97]. The

corresponding overview of privacy types and typologies is presented in subchapter 1.2 of the

introduction. The following paragraphs complete the foundation needed for the privacy

framework development, and present existing privacy frameworks and threats for SNSs user

privacy.

Concerning privacy frameworks, the framework by Carew and Stapleton, developed in 2005,

covered most of the privacy concepts presented in subchapter 1.2 and grouped them into five

dimensions of privacy (physical, social, psychological, informational and global), each

containing several different privacy aspects [53]. Dinev et al. created a privacy framework for

the Web 2.0 context, “linking privacy and its various correlates together” and identifying

anonymity, secrecy, confidentiality, and control as “relevant correlates” [180]. In 2014,

Crawford and Schultz addressed privacy threats in BD from a law perspective and developed a

model of procedural data due process as a response [181]. Furthermore, Trepte et al. empirically examined internet users’ online privacy attitudes and behaviours, and developed the online privacy literacy scale (OPLIS), which distinguishes between five dimensions of online privacy literacy [182].

Petkos et al. used a quantitative method to study privacy and introduced a framework for raising

SNS users’ awareness about their disclosed data [183]. They distinguished eight different dimensions of PD, each with different attributes and values, to calculate the PScore. A similar

approach was taken by Zeng et al., who quantitatively evaluated the privacy risk for SNS users

as a function of their privacy awareness with their TAPE framework [184]. However, it is

important to note that the conceptual privacy framework developed in the following is not solely

focused on the users and their privacy awareness, but on the policies and measures to protect

and restore all relevant dimensions of user privacy in SNSs. Accordingly, the framework is not

quantitative, but contains qualitative dimensions with conditions that can be fulfilled by corresponding regulations and provisions.


Regarding the issue of privacy threats for SNSs, one of the earliest papers summarizing

corresponding threats was published by Abril in 2007 [185]. She outlined the danger of unintended audiences and the increased risk which occurs when postings and pictures are revealed to third parties, allowing them “to manipulate and further disseminate” them [185].

Preibusch et al. also investigated privacy problems “not bounded by the perimeters of individuals but also by the privacy needs of their social networks” [186]. Fire et al. provided a comprehensive review of privacy and security threats in SNSs and presented corresponding

solutions [65]. They grouped the privacy threats into four different categories: classic threats,

modern threats, combination threats, and threats specifically targeting children. While the

classic threat category consists of general internet security and privacy threats that do not

exclusively concern SNS users, the modern threats are unique to the SNS environment and are

of special interest for the following analysis. A compendium on privacy threats in SNSs is

provided by the corresponding chapters of the Acatech study [47]. Likewise, Spiekermann et

al. and Weber published papers concerning the future threats for privacy and PD, and for the

corresponding markets [187, 188]. In 2016, Kumar et al. published their risk analysis of SNSs,

containing a list of 20 different security and privacy threats extended with three additional

points for the unique case of health care SNS [189].

Krishnamurthy and Wills published a study on user privacy in SNSs, showing the defective user

enforcement and awareness in using privacy controls in 2008 [190]. Furthermore, Zheleva and Getoor demonstrated that private information of SNS users can be revealed by inferences from their network even if their own account is non-public [191]. Bonneau and Preibusch

analysed the privacy practices and policies in SNSs with regard to their market behaviour and

profit drive, finding inter alia that the market for privacy in SNSs is “dysfunctional” [89]. In

addition, Zhang et al. found “inherent design conflicts” between their posed security and

privacy goals and SNS usability and sociability [17]. The complex trust issues of user privacy

in SNSs were examined by Shakimov and Cox regarding users’ trust in SNOs, app developers,

and other users in 2011 [192].

Greschbach et al. examined privacy challenges for decentralised SNSs, which are not directly

in the scope of this subchapter but are mentioned for completeness [193]. Furthermore,

discrepancies in the user understanding of privacy policy language, particularly with respect to data sharing, were found by Reidenberg et al. [108]. They indicated the lack of transparency and

the resulting IA between SNOs and users created by complex legal language. In addition, Henne


et al. conducted an online survey and suggested that user privacy and control should focus on

the privacy of users’ own generated content and on the information revealed by the uploaded

media of other users, especially regarding geo-tagged photos and videos [194]. Finally, chapters

2 and 3.2 show that privacy cannot currently be enforced by market competition.

4.1.2 Identified Privacy Threats and Data Types

Based on the literature review presented above, the following threats to user privacy in SNSs were identified (see Table 7), together with their cause-effect relationships (see Figure 10) and the corresponding types of PD.

Table 7. Identified SNS Privacy Threats

1. Information asymmetry: Unintelligible SNS privacy policies, terms of use, permissive privacy defaults, and opt-in settings lead to IA between the SNO and the user about which PD is visible to whom and used for which purposes. [47, 89, 187, 194]

2. Limited user ability: Insufficient SNS privacy controls and misleading user interfaces limit the ability of users to effectively tune their privacy settings. [17, 47, 89]

3. Data leakage: Immense PD aggregation and insufficient security by an SNO can lead to data leakage. This can happen by accident or by attacks, both from outside (e.g. phishing attacks) and inside the SNS (e.g. fake-profile friend requests). [47, 65, 187, 189]

4. Unintended audiences: Revealed PD is visible to an originally unintended audience. This can occur due to IA (1) and limited user ability (2). Negative consequences may be misuse (8), retransmission, and data leakage (3). [17, 65, 89, 185]

5. PD use for secondary purposes: PD is used by the SNO or third parties for a purpose which the SNS user originally did not agree to or was not aware of (e.g. due to IA (1)). [47, 186–189, 192]

6. Direct data inferences: UGC from person A in combination with meta data (e.g. photo geo tags) and/or BD analysis reveals new insights which she did not intend to disclose. [187, 188, 194]

7. Indirect data inferences: Published content from person A reveals information about person B to third parties without B’s awareness or approval (e.g. photos of public spaces or events containing person B). [186, 188, 191, 194]

8. Data manipulation: Unauthorized reuse of PD (e.g. photos) in a manipulative context and/or editing or faking PD for harmful purposes. [17, 185, 186, 189]

Information asymmetry between the user and the SNO can cause misunderstandings regarding the use of the collected PD for secondary purposes and the visibility of postings and messages. Similar misunderstandings occur due to the limited ability of SNS users to put their privacy preferences into effect. A data leakage at the SNS itself, as well as at third services which had access to the SNS user data, can also lead to unintended audiences of PD or unauthorized use of PD for other purposes. The results of these incidents can be direct and/or indirect data inferences which may reveal additional insight into the SNS user and result in data manipulation, compromising the user (e.g. [195]).

While IA and limited user ability are pure SNS privacy aspects, data leakage is mainly a security issue. Although the literature discussed above addressed security threats in SNSs and their

possible solutions, the upcoming framework focuses solely on privacy threats to SNS users. A detailed list of SNS security attacks and possible technical solutions is provided e.g. in the paper of Fire et al. [65]. Finally, the literature analysis revealed the privacy threats listed

above and their cause-effect relationships, as well as the following different PD types. All the

previously listed privacy threats exist to a different extent for all PD types:

a. Direct PD refers to data directly describing the SNS user, often entered by herself (e.g. name,

birthdate, and hometown).

b. Communication data describes data created and used by the SNS users for communication

(e.g. content and recipients of direct messages or postings on timelines and personal walls).

c. Images and videos stand for all photos, images and videos uploaded by users that show users

or sceneries which allow conclusions to be drawn about a user (e.g. locations) [185].

d. Meta data describes all data not directly entered but created by a certain SNS user and related

to her, such as time stamps of postings and messages, a history of articles and postings, login

stamps, and tracked movement on a website [188].

e. Location data is a delicate type of meta data that is listed as an extra data type. It describes

data which tracks the current or past position of a user and creates a partial or complete movement profile of her. Examples of location data are GPS coordinates, used Wi-Fi

connections, and geo tags of photos and videos [188, 194].

f. Health data are data which allow conclusions about the mental or physical health of a person. This can be postings or photos showing current injuries, proving a hospital stay, or describing the person’s feelings. Furthermore, data from wearables and fitness trackers showing the fitness history of a user are considered health data [189].

g. Association data refers to all data which can provide information about a person’s network. This could be a list of her friends, information about memberships in clubs and associations, and photos and videos showing the person with other people or at certain events (e.g. a political demonstration or a union meeting) [186].

Figure 10. Privacy Threat Cause-Effect Relationships.

4.1.3 Developing the Multidimensional Privacy Framework

To comply with the multidimensionality of user privacy outlined by the literature review above, the general structure of the privacy framework is divided into three dimensions. First, SNS privacy is separated into three conditions, each split into two classifications; thus, there are six privacy classifications. Second, SNS privacy is divided into seven types according to Finn et al. [60]. Third, time is introduced as a dimension, sorting privacy decisions and measures into the points ex ante and ex post according to subchapter 3.2.4. Thus, a three-dimensional framework for user privacy in SNSs is

constructed (see Figure 11).

Figure 11. The Privacy Dimensions.

In the previously mentioned Acatech study, the authors categorized the privacy threats for SNSs into three major conditions “which have to be fulfilled in order to gain privacy protection”: awareness, control, and trustworthiness. While “the formation of privacy preferences requires user awareness of the privacy-relevant aspects of [SNS] policies and their potential impact”,

they further state that “the implementation of [user] preferences requires the users to be able to

appropriately control the policies and related processes in OSN that deal with their personal

data [...]”. Finally, the authors find that “trustworthiness of OSN in regard to compliance with

the applicable regulations is required” [47]. These conditions are adopted, including their

definitions, as the first horizontal layer of the privacy framework to address the different privacy

stages an SNS should fulfil for user privacy (see Table 8).

These three conditions can each be separated into user-centric approaches, provided by the SNO or another entity, and purely SNO-centric approaches. Therefore, the work of Netter et al. and their classification of SNS privacy research are relevant [138], as the terms and definitions of

user awareness and transparency are used to substantiate the condition of awareness. Netter

et al. are followed, defining user awareness as the SNS user’s “attention, perception, and

cognition of the personal information others have received and how this information is or may

be processed” and corresponding transparency as “the user’s ability to be informed of

processing and dissemination practices” [138]. Thus, user awareness is considered a user-centric approach based on the individual user’s attention, perception, and cognition which must

be preserved by privacy stimuli either by the SNO or by others [196]. Maintaining transparency

and providing all necessary information about PD collection and processing in a readable and

understandable way is an SNO responsibility. Transparency is useless without the user’s interest in privacy created by user awareness, which in turn is futile without the transparency needed to comprehend

the flow of PD.

The control condition is divided into the classifications of user enforcement on the SNS user

side and data sovereignty on the SNO side. Definitions for both are given by Netter et al., who

describe user enforcement as an SNS user’s intention and ability to bring her privacy

preferences into force [138]. In addition, “data sovereignty describes the extent to which an

individual is able to control the processing of his personal data”, where the mechanisms and

control options must be implemented by the SNO [138, 197].

The last condition, trustworthiness, is translated into the complex term accountability and follows the conceptualization of Zimmermann [198] (see Figure 12). He argued that the

construct of accountability contains a core that includes the aspects of transparency,

controllability, and liability. Controllability requires agreed-upon rules for the SNO to follow (e.g. given by corresponding legislation). He explained that a system “is transparent if its

behaviour can be observed and it provides information regarding its future behaviour”, it


“exhibits controllability if it allows determination whether it followed afore defined rules”, and

it “enables liability if it makes possible sanctions for noncompliance to agreed-upon rules”

[198]. Thus, accountability should enable SNS users and external institutions to comprehend

and control whether the SNO follows imposed policies and to sanction the SNO if she violates

those policies. Accountability is therefore a responsibility for the SNO and for its regulating entity, while the SNO must provide the possibility of monitoring her actions to guarantee

accountability.

The Zimmermann concept of accountability completes the horizontal scale of the

multidimensional privacy framework, which contains three successive major conditions:

Awareness as a basis for successful control and accountability as a precondition for

trustworthiness. Moreover, awareness is split into user awareness as a basic user requirement

and transparency as a basic SNO requirement for establishing the first privacy condition.

Furthermore, the same is done for the second condition, control, which contains user

enforcement and the provision of data sovereignty on the SNO side. Finally, the third privacy

condition of trustworthiness is translated into monitoring and accountability, the power to

monitor and sanction the actions of the SNO regarding given policies (see Table 8).

Table 8. Horizontal Privacy Conditions and Classifications.

Three major conditions for privacy protection, each with two privacy classifications:
Awareness: User awareness, Transparency
Control: User enforcement, Data sovereignty
Trustworthiness: Monitoring, Accountability

Figure 12. Conceptualization of Accountability as a Privacy Principle (taken from [196]).


The vertical dimension of the privacy framework should map the different types of privacy,

primarily but not exclusively for SNSs. Therefore, the categorization from Finn et al. is used, which extended Clarke’s privacy categorization to seven different types of privacy that are suited to expand the framework on the vertical axis [60]:

Privacy of the person refers to Clarke’s definition of bodily privacy which is related to the

integrity of a person’s body [58]. Threats to this privacy type include “many medical and

surveillance technologies and practices” [60], such as fitness wearables which monitor and

record the physical health of an individual and data stored on a health insurance card. At first glance, the privacy of the person and its examples have no intersection with user privacy in SNSs.

However, various fitness wearables and their connected applications provide an option to connect their accounts to FB and other SNS accounts and to share user progress and health data with friends. In addition, users tend to voluntarily share information about their health in

SNSs (e.g. in online self-help groups or conversations with their relatives and doctors) [199,

200]. Thus, SNSs contain a treasure of health data which is valuable for insurance companies or potential employers, especially in combination with information about a user’s hobbies and

other health-related information.

For privacy of personal behaviour and action, the extended characterisation of Finn et al. is

used. This privacy type includes “sensitive issues such as sexual preferences and habits,

political activities and religious practices. Hence, anything that unintentionally reveals those

characteristics of a person can be considered a threat to this privacy type” [60]. This privacy

type also includes the visible actions of an individual which can occur in public. Therefore, Clarke’s restriction is followed, which distinguishes between the random observation of revealing behaviour in public and the systematic recording and storage of corresponding information and

data [58]. For SNSs, the first could include accidentally capturing a person in a photo of a public

demonstration that was uploaded by another user. The second represents the (semi)automatic

detection and tagging of that person within the SNS or any other action which reveals the

preferences, habits, and activities through BD analysis.

The term privacy of personal communication stands for the inviolability of all kinds of

communication, verbal and digital, including personal messages in an SNS, video-calls, and

conversations overheard by the SNS’s smartphone application.


Similarly, the privacy of data and image is evident, as it contains the privacy of all PD and

images including “concerns about making sure that individuals’ data is not automatically

available to other individuals and organisations” [60]. This includes self-added images and

data as well as images and data added by third parties about a specific SNS user.

The privacy of thoughts and feelings type is more abstract and represents the creative freedom

of individuals and their right to think whatever they like, as well as their right to be sure that

their thoughts and feelings are not revealed to others against their will and without their

knowledge. Threats to this privacy type are closely connected to the privacy of personal

communication and the privacy of personal behaviour and action, because leakage in one can

allow conclusions about someone’s thoughts or feelings. Moreover, leading SNSs such as FB

motivate their users to share their feelings via supplements to their postings or emoticons. Hence, a targeted compilation of those expressions can detect a user’s feelings and also represents a threat to this privacy type [201].

Privacy of location and space stands for the right of individuals not to reveal their current

location and their movement profile, and SNSs contain several threats to this type of privacy.

First, users may reveal their past location via negligent postings, messages, photos, and videos.

Second, FB encourages its users to post their location and search for nearby friends, revealing

that location to the SNO and potential advertisers [202]. Third, third parties can exploit this location data by targeting advertisements at users in a certain location and drawing conclusions from their positive responses [203]. Finally, SNS applications on smartphones may share the

location data with the SNO without the user noticing.

Privacy of association “is concerned with people’s right to associate with whomever they wish, without being monitored” [60]. This right is often forgotten but is crucial to a modern

democratic society. It covers the right of individuals to decide whether their membership in

parties or unions is revealed and to whom. Threats to this privacy type by SNSs range from the

simple violation of the privacy of personal communication or of data and images, to inferences drawn from meta data analysis which reveal clusters of people sharing the same group membership.

Until now, the privacy framework has been a compilation of the three privacy conditions from

Buchmann et al. [47] and the core of the four privacy classifications from Netter et al. [138],

hierarchically ordered and supplemented by the accountability construct of Zimmermann [198]


on the horizontal axis, and specified by the seven types of privacy of Finn et al. [60] on the

vertical axis. The final dimension added now is time, separated into ex ante and ex post as described in subchapter 3.2, which makes the framework unique compared to all former versions.

Ex ante stands for the point in time before the user enters the data or even joins an SNS. For

the privacy type of personal communication, this time frame adapts to the different privacy

classifications as follows: Before the user sends a message, is she aware that the message or the sense of its content may be revealed to a third party (e.g. for marketing purposes)? Does the SNO provide the transparency that enables the user to comprehend whether and in which ways her data is analysed and used? Can the user change the corresponding settings before sending a message

to prevent data disclosure, and does the SNO provide appropriate controls to guarantee this data

sovereignty to the user? The steps of compliance monitoring and corresponding accountability

can also be fulfilled ex ante, either through signalling by the SNO or screening by external

services (cf. subchapter 3.2.4). However, they are only integrated ex post for the framework

because the actual monitoring and liability of the SNO are only feasible after data flow.

Accordingly, ex post stands for the point in time after a user has entered certain data or joined

an SNS. The example of the privacy of personal communication is useful again, as ex post user

awareness means that the user is conscious that her message could have been passed to third

parties or been mechanically analysed. The transparency classification means that the SNO

provides suitable information in a comprehensible way to the user. Furthermore, ex post user

enforcement stands for the ability of the user to change the appropriate settings afterwards, and

data sovereignty indicates that the SNO must provide suitable tools. Finally, the ex post

perspective allows for the corresponding monitoring of the SNO’s compliance with an ex ante

agreed upon data policy and the SNO’s implementation of the ex post user-made changes. In

the case of violations, the SNO must be held accountable ex post by an external authority (e.g.

a government institution).


Table 9 presents the complete multidimensional privacy framework. The three privacy

conditions are each subdivided into two privacy classifications as depicted on the horizontal

axis, and the privacy classifications are subdivided in time as depicted on the vertical axis.

Table 9. The Multidimensional Privacy Framework.

Horizontal axis: the three conditions for privacy protection (Awareness, Control, Trustworthiness) and their privacy classifications (User awareness, Transparency, User enforcement, Data sovereignty, Monitoring, Accountability).

Vertical axis: the types of privacy, each subdivided by point in time:
I. Person (ex ante / ex post)
II. Behaviour and Action (ex ante / ex post)
III. Personal Communication (ex ante / ex post)
IV. Data and Image (ex ante / ex post)
V. Thoughts and Feelings (ex ante / ex post)
VI. Location and Space (ex ante / ex post)
VII. Association (ex ante / ex post)
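Read computationally, the framework is simply the Cartesian product of its three dimensions. The following sketch enumerates the resulting framework cells; the dimension labels are taken from the text, while the variable names are illustrative and not part of the dissertation:

```python
from itertools import product

# The three privacy conditions with their two classifications each (Table 8).
CONDITIONS = {
    "Awareness": ("User awareness", "Transparency"),
    "Control": ("User enforcement", "Data sovereignty"),
    "Trustworthiness": ("Monitoring", "Accountability"),
}

# The seven types of privacy according to Finn et al. [60].
PRIVACY_TYPES = (
    "Person", "Behaviour and Action", "Personal Communication",
    "Data and Image", "Thoughts and Feelings", "Location and Space",
    "Association",
)

# The time dimension from subchapter 3.2.4.
TIME_POINTS = ("ex ante", "ex post")

# Flatten the six classifications out of the three conditions.
CLASSIFICATIONS = tuple(c for pair in CONDITIONS.values() for c in pair)

# Each cell of the framework is one (type, classification, time) triple:
# 7 x 6 x 2 = 84 cells that a privacy policy or regulation can address.
FRAMEWORK_CELLS = tuple(product(PRIVACY_TYPES, CLASSIFICATIONS, TIME_POINTS))

print(len(CLASSIFICATIONS), len(FRAMEWORK_CELLS))  # 6 84
```

The sketch makes the hierarchy explicit: the classifications are derived from the conditions, so a policy that fulfils all six classifications automatically satisfies all three conditions.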


4.1.4 Framework Range of Coverage

In this section, the main threats to user privacy in SNSs identified in the literature review are matched and assigned to the different dimensions of the privacy framework. It is shown which identified threats and types of PD are covered by which types and classifications of the privacy framework, to demonstrate its range of coverage. The objective is to illustrate that a policy which fulfils all aspects of the framework will also tackle all major threats to user privacy in SNSs and beyond.

The different types of PD seem to align with the privacy types of the framework: Direct PD (a)

matches with privacy of data and image (IV), and privacy of personal communication (III)

contains communication data (b). Images (c) are also part of the privacy of data and image

(IV), and privacy of location and space (VI) covers location data (e). Health data (f) belongs

to the privacy of the person (I), and privacy of association (VII) coincides with association

data (g). Only meta data (d) cannot be directly assigned to one privacy type. As explained, meta data covers a wide range of accrued indirect data about a specific SNS user, such as traced browser history, tracked cursor movement, and IP detail records. Thus, meta data is contained

in different privacy types, including privacy of the person (I), privacy of personal behaviour

and action (II), and privacy of location and space (VI).
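The alignment just described can be sketched as a simple mapping; the letters and Roman numerals follow subchapters 4.1.2 and 4.1.3, while the dictionary itself is merely an illustrative restatement of the text:

```python
# PD types (a-g) mapped to the privacy types (I-VII) that contain them.
PD_TO_PRIVACY_TYPES = {
    "a. Direct PD": ("IV. Data and Image",),
    "b. Communication data": ("III. Personal Communication",),
    "c. Images and videos": ("IV. Data and Image",),
    "d. Meta data": ("I. Person", "II. Behaviour and Action",
                     "VI. Location and Space"),
    "e. Location data": ("VI. Location and Space",),
    "f. Health data": ("I. Person",),
    "g. Association data": ("VII. Association",),
}

# Meta data (d) is the only PD type spanning several privacy types.
spanning = [pd for pd, types in PD_TO_PRIVACY_TYPES.items() if len(types) > 1]
print(spanning)  # ['d. Meta data']
```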

While the diverse data types are reflected in the seven different types of privacy, the various

privacy threats are covered by a combination of the framework’s privacy classifications and the

time points. For clarification, the detailed coverage is explained in the following paragraphs.

Since IA, limited user ability, and data leakage are the causes for the subsequent threats, the

analysis begins with those.

To avoid IA (1), the ex ante and ex post privacy condition of awareness must be fulfilled, and

with it the classifications of user awareness and transparency for all different privacy types.

Moreover, SNS users must be fully aware of and informed about all SNO activities related to

the seven privacy types (and those which might come in the future). All information that is

necessary to comprehend the range of these different types and make an appropriate decision

for their privacy preferences must be provided to the users by the SNO. Correspondingly, the

SNO must provide ex post and ex ante transparency regarding data revelation, collection, usage,

and transfer.


To overcome limited user ability (2), the ex ante and ex post characteristics of the control condition, i.e. the classifications of user enforcement and data sovereignty, must be complied with for all seven types of privacy. Thus, SNS users (user enforcement) must be provided with the appropriate controls (data sovereignty) to state their preferences regarding data disclosure, collection, use, and transfer across the different privacy and data types. Finally,

the SNO must be monitored and held accountable for her compliance with the users’ preferences

if necessary.

Although data leakage (3) cannot be fully avoided, an SNO must do everything possible to

secure all data flows with state-of-the-art technology and check that all partners to whom data

is transferred do the same. In addition to ex ante user awareness for the general threat of data

leakage and the responsible handling of their data, the only other privacy classifications

containing the aspect of protection against data leakage are SNO monitoring and

accountability. SNS users and/or a third authority must monitor the SNO for compliance with data security and hold the SNO accountable in the case of violation or carelessness.

To avoid an unintended audience (4) for all types of privacy, the ex ante classifications of user

awareness, transparency, user enforcement and data sovereignty are necessary. The SNS users

must be aware of the possibility of an unintended audience, which is only possible in

combination with corresponding SNO transparency about which data is revealed to whom.

Furthermore, users must control these conditions upfront before revealing their data (user

enforcement), and need appropriate controls provided by the SNO (data sovereignty).

Compliance with all these conditions ex post is also advantageous, but does not protect against the initial unintended audience.

The avoidance of PD use for (unintended and unauthorized) secondary purposes (5) must fulfil the same conditions as the avoidance of an unintended audience (4), except that here the ex post aspects are highly relevant. If a user later decides to restrain the use of certain data for secondary purposes, the SNO and the third services that have received that data must obey. In addition to

ex ante privacy classifications, all ex post classifications are relevant including monitoring and

accountability to ensure that the SNO and third services followed the user’s instructions.

The drawing of direct and indirect inferences from PD (6 & 7) can be contemplated together. Ex ante user awareness, SNO transparency, user enforcement, and data sovereignty are important for all seven privacy types. Furthermore, SNS users must be aware of and


have the option to decide whether inferences may be drawn from their entered data and from data others have entered concerning them. The possibility to change these initial decisions ex post must also be provided. Moreover, compliance with SNS users’ choices must be monitored, and the SNO must be held accountable when required; thus, all aspects of the framework are affected.

To avoid data manipulation (8), all ex post classifications of the privacy type data and image

(IV) must be complied with. However, because of the rising trend towards using pictures and

videos instead of text for communication and the possibility of communication manipulation

by adding or hiding essential text parts to manipulate the meaning of a message, the conditions

can be extended to the privacy of personal communication (III). Other sorts of data

manipulation are also possible. Health data (f) and the privacy of thoughts and feelings (V)

can be manipulated and misused, and connections that enable conclusions about membership

in associations (VII) can be manipulated or faked. In summary, all seven types of privacy can

be affected by data manipulation, and SNS users must be aware of the threat of reuse of their posted data, images, or messages. Corresponding transparency must be provided by the SNO.

Moreover, users need to be provided with the corresponding controls (data sovereignty) to

comprehend and change the reuse of their data, and must be competent to use them (user

enforcement). Finally, SNS users or a third authority must monitor the SNO for compliance

with these changes and the corresponding data use and data security, and hold the SNO

accountable in the case of violation or carelessness.

Figure 13 summarizes how the identified privacy threats are embracing the multidimensional

framework on different axes. The three conditions for privacy protection on the horizontal axis

and the seven types of privacy on the vertical axis are affected. As described above, the third

axis represents the time at which one is affected by the identified privacy threats, ex ante or ex post. Thus,

a privacy policy or legislation fulfilling all fields of the multidimensional privacy framework

has addressed all identified privacy threats including all noted types of PD.
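The coverage check that the framework enables can also be illustrated computationally. The following minimal Python sketch is not part of the thesis; the axis labels follow the framework's six classifications, the seven privacy types of Finn et al., and the two points in time, while the function and variable names are assumptions chosen for demonstration:

```python
from itertools import product

# The framework's three axes as plain lists (labels taken from the text;
# the representation itself is an illustrative assumption).
CLASSIFICATIONS = ["user awareness", "transparency", "user enforcement",
                   "data sovereignty", "monitoring", "accountability"]
PRIVACY_TYPES = ["person", "behaviour and action", "communication",
                 "data and image", "thoughts and feelings",
                 "location and space", "association"]
TIME = ["ex ante", "ex post"]

def uncovered_fields(policy_coverage):
    """Return every (classification, type, time) field the evaluated policy
    leaves open; a policy fulfils the framework only if this list is empty."""
    return [field for field in product(CLASSIFICATIONS, PRIVACY_TYPES, TIME)
            if field not in policy_coverage]

# A hypothetical policy covering only ex ante transparency for data and image
# leaves 6 * 7 * 2 - 1 = 83 of the 84 framework fields open.
partial = {("transparency", "data and image", "ex ante")}
gaps = uncovered_fields(partial)
print(len(gaps))
```

Such an exhaustive enumeration mirrors the qualitative procedure described above: a privacy policy or legislation is mapped field by field onto the three axes, and any remaining gap marks an unaddressed privacy threat.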

4 SNS Privacy and Regulation 75

Figure 13. Privacy Threats Embracing the Framework.

4.1.5 Framework Discussion

The presented conceptual privacy framework is a qualitative tool for identifying all relevant

privacy types and conditions of classifications for user privacy in SNSs. Based on

corresponding literature reviews, it combines state-of-the-art theories on online privacy with

recent research results and a new and unique approach which includes time as a privacy

dimension. The framework’s field of application is broad, as it can be used to evaluate whether

privacy policies and legislations cover all important aspects of SNS user privacy. Furthermore,

in combination with the identified threats and data types, the framework can be utilised as a

pattern to find and classify privacy threats and resulting attacks for SNS PIAs or as a basis for

SNS PBD approaches.

However, the presented framework does not include a quantitative measurement to display how

extensively the different privacy spheres are fulfilled, and it does not contain technical solutions


to the identified privacy threats. Additional quantitative options and a list of technical solutions

were not implemented because that would have gone beyond the initial scope and made the

framework excessively complex. Respective quantitative approaches to online privacy subareas

are provided by Trepte et al. [182], Petkos et al. [183], and Zeng et al. [184], and technical

solutions can be found in the work of Zhang et al. [17], Henne et al. [194], and Fire et al. [65].

The relevance of the presented framework is demonstrated by recent debates about privacy

legislation. The ongoing discussion about the recent European GDPR, its implementation in

national law, and its efficacy to protect user privacy against companies such as SNOs and

against the state highlights the application area of this work [85, 204, 205]. Moreover, privacy

legislations are also up for debate in Australia due to the recently published Data Availability

and Use Report of the Australian Productivity Commission and the included Comprehensive

Right for consumers [206].

SNSs such as FB and other online MSPs such as Google are implementing privacy policies and

controls that are compliant with these regulations [31]. However, it is unclear whether these

provisions fulfil the regulation’s conditions and whether the companies can be monitored and

held accountable if required. This demonstrates another field where the presented privacy

framework could be of use.

4.1.6 Potential Applications and Implications

In this subchapter, a conceptual, multidimensional privacy framework for online privacy for

SNSs was provided. The three dimensions consist of seven different types of privacy; three privacy conditions, each subdivided into two stages; and the time of ex ante and ex post. This concept is based on a combination of prior work on the nature of online privacy, identified through a literature review. Furthermore, the framework covers the eight

major privacy threats for SNS users for all identified seven types of PD. These threats and data

types were also identified from the literature.

The multidimensional privacy framework can be used to qualitatively evaluate whether privacy

policies and legislation and the corresponding provisions cover all relevant aspects of user

privacy in SNSs. The conceptual framework represents a combination of the state-of-the-art in

privacy research but is not an ultimate and fixed construct. New developments in technology

and social media businesses will make adapting the framework a necessity in the medium term.


Some of the identified privacy threats and parts of the privacy framework have already been

addressed by major online companies such as FB and Google with newly implemented privacy

controls and transparency instruments (e.g. FB privacy controls and activity protocol; Google

MyAccount). However, there is still room for improvement regarding the new privacy types

including privacy of thoughts and feelings and privacy of association. Recent incidents, such

as the misuse of a refugee’s selfie with German chancellor Angela Merkel by right-wing

populists on FB [195] and the detection of political preferences such as anti-Semitism on FB

[207], show that not all known privacy threats are satisfactorily addressed. Furthermore, the

brain-computer interfaces presented at FB's recent developer conference F8 indicate that the

aspects of the privacy of thoughts and feelings will become more important for the future

development of privacy, and that new privacy threats will likely occur [208].

The responsibilities for privacy actions emerging from the framework require discussion and

future research. For instance, it is unclear which institution has the jurisdiction and competence

to raise general user awareness and teach the corresponding user enforcement. There are several

points to consider regarding whether this should be done by SNOs, governmental institutions,

or third services and non-governmental organisations (NGOs). The same questions arise for the

responsibility for monitoring and accountability. The opportunity for monitoring must be

provided by the SNO, and the legal framework ensuring the SNO’s liability must be

implemented by national or supranational legislation. However, SNS users still seem unable to

monitor their SNOs, and this is a service which could be provided by third companies, NGOs,

or a state agency. Recently introduced national and supranational privacy regulations such as

the European GDPR and its implementation into the national law of the European Member

States are potential evaluation targets for the presented framework [34].


4.2 The EU Approach to Privacy

A credo for the digital community should be “we are all social network members”.

Approximately 67% of worldwide internet users are members of an SNS. At least 50% of the

digital community participates actively in the largest SNS, FB, and there are other large SNSs

including the Chinese platforms QZone and Weibo, and the Russian network VK. Focusing on

the EU, 63% of its internet users and 50% of the total EU population participate in SNSs. Thus,

the topic of user privacy in SNSs is relevant for at least 2.5 billion people worldwide and 206

million in the EU, and these numbers are rising [23, 24, 209, 210].

A promising attempt for privacy regulation in SNSs is provided by the recently enacted EU

GDPR. After more than five years of negotiations, the European Parliament finally adopted the

GDPR on 14 April 2016. It formally entered into force on 24 May 2016, but the national

legislators have until 25 May 2018 to implement it into their respective national law, due to

various opening clauses that give Member States a margin of discretion in the application of

the regulation [34]. Although the GDPR mainly focuses on data protection, it also contains

regulations in the privacy field. Hence, it is worthwhile to analyse whether the GDPR is able to

regulate the main privacy threats in the SNS businesses.

Consequently, this subchapter analyses the GDPR from an IS and legal viewpoint, focusing on

its impact on privacy in SNSs. In detail, the following analysis examines the GDPR obligations

for SNOs regarding the business to customer (i.e. the SNO to SNS user) privacy relationship.

For simplicity, the challenges of international data transfer as well as data transfer to private

third parties are excluded. The assessment follows the example of discursive technology

assessments in accordance with [211]. First, a short introduction to the GDPR and the

presentation of its privacy-relevant sections will be provided. The regulation's impacts on SNS privacy are then examined by using the framework from subchapter 4.1 to assess the relevant GDPR coverage.

Finally, the legal and economic capability of the GDPR and the technical feasibility of its measures will be discussed by means of recent literature and scientific

findings to answer RQ3 about whether the legislation provides a regulatory solution for user

privacy in SNSs and what its influence on the SNS market is (cf. subchapter 1.3).


To build the framework for this analysis, apart from the definitions given in subchapter 1.1, the

following legal terms must be clarified. The GDPR definition of PD was presented in

subchapter 1.1.1. Furthermore, the definition of SNS in subchapter 1.1.2 is congruent with the

term filing system, defined by the GDPR as "any structured set of personal data which are accessible according to specific criteria" [34]. Moreover, the given definition of the SNO (cf. subchapter 1.1.3) generally fulfils the role of the recipient "to which the personal data are disclosed", and

the controller “which determines the purposes and means of the processing of personal data”

according to the GDPR. Finally, if the collected PD is transferred by the SNO to third parties

for processing, those third parties constitute the role of the processor which “processes personal

data on behalf of the controller” [34].

4.2.1 Related Literature

Concerning the legislation, in 1998 Allaert et al. showed various system implications of the

former EU Data Protection Directive [86], and Tan provided an overview of the pre-GDPR

status in 1999 [212]. In 2012 and 2013, various scholars analysed the first GDPR proposal and

its possible effects on data security and privacy [204, 213–217]. Ciriani recently analysed the economic impacts of the final GDPR [84]. Moreover, in 2015, Kiss and Szoke as well as Kolah and Foss investigated the GDPR's impacts on and advances in data security [218, 219]. Newman examined the GDPR's right to be forgotten and its

consequences [205], and one year later, Ryz and Grest demonstrated why the GDPR will lead

to a new era of data protection compared to former legislations [85]. Finally, Hansen elaborated

the GDPR provisions of privacy by design and privacy by default and their potential impact on

data processing in Europe and beyond [88]. However, no researcher has explicitly examined the GDPR's impact on privacy in SNSs, and the following work provides a new and unique

analysis approach to the new EU regulation.


4.2.2 Recapitulation: The Multidimensional Privacy Framework

As a tool for structuring and evaluating the relevant parts of the GDPR, the multidimensional

privacy framework is used as introduced in subchapter 4.1. It conceptualises privacy as a three-dimensional construct including three privacy conditions with six different classifications, the

seven different privacy types of Finn et al., and two points in time (i.e. ex ante and ex post)

[60]. The framework is a result of an analysis and merger of the state-of-the-art privacy

definitions and concepts in psychology and information systems focused on SNS. Thus, this

privacy framework is best suited to evaluate whether a legislation such as the GDPR covers all

relevant factors of SNS privacy. The framework’s three privacy conditions and their

specifications are briefly explained in this subchapter to structure the following analysis of the

GDPR. In the subsequent subchapter, the seven privacy types of Finn et al. and the framework’s

two time points are explained in detail and used to evaluate and discuss the findings.

The framework’s major privacy conditions are drawn from the Acatech study by Buchmann et

al. [47]. The authors stated that the three conditions awareness, control, and trustworthiness

“have to be fulfilled in order to gain privacy protection in OSNs” [47]. To specify these three

conditions, they are each divided into a user-centric and an SNO-centric approach (cf. Table 8).

Awareness originally referred to “the users knowing and understanding which personal data

about them is available to the SNS” and how this data is stored, transferred, and/or used [47].

It is subdivided into user awareness, representing the user’s attention, interest, and capability

for this issue, and transparency, illustrating the necessary tools and information provided by

the SNO to understand and comprehend the amount and flow of the SNS's collected data. User awareness must be an intrinsic motivation or be triggered by an outside institution, such as school education or governmental or NGO campaigning, while transparency is an SNO duty for which the provider must be held accountable.

Furthermore, control is originally defined as the opportunity for SNS users to implement their

privacy preferences [47]. It is subdivided into user enforcement, representing users' skills to translate their privacy preferences into actions and implement them with the corresponding SNS options, and data sovereignty. Data sovereignty stands for the necessary

tools provided by the SNO for users to implement their privacy preferences, and for the range

of coverage of those tools in terms of controlling, correcting, and deleting user data at the user’s


will. The first classification, user enforcement, does not need to be taught by the SNO, as it can

be taught in schools, other institutions, or by self-learning.

The last condition represents the SNO’s trustworthiness “to implement an appropriate level of

privacy protection”. However, as several researchers have stated, there is a conflict between the

SNO’s goal of increasing profits and the duty to implement privacy protection [29, 31, 220].

Therefore, the condition is subdivided into the classifications of monitoring and accountability.

The apportionment breaks with the user-centric and SNO-centric order here. Monitoring represents the SNO's obligation to disclose its actions transparently and comprehensibly to the corresponding supervisory body; accountability represents the opportunity to sanction an SNO for noncompliance with agreed-upon rules. The first requires an authority which is authorized and

can inspect the data flows of an SNS without threatening its business secrets and

competitiveness. The second requires comprehensible rules or legislations for SNSs that are

stated and enforced by a higher authority.

4.2.3 The GDPR: Genesis, Structure and Aim

This subchapter provides an overview of the legislation components that are relevant to SNS

privacy. The privacy-relevant articles of the GDPR are assigned to the three conditions of

privacy identified in the multi-dimensional privacy framework described above: awareness,

control, and trustworthiness.

The GDPR replaces Directive 95/46/EC (the Data Protection Directive) and harmonises data

protection and privacy law within the EU. Its major achievements regarding privacy law are the

extension and strengthening of existing rights of the data subject, and the creation of new rights

such as the “right to data portability”[Art.20] and the much-debated right to be forgotten

[Art.17]. The GDPR applies to EU companies, and to controllers and processors that are not

established in the EU, where “it is apparent that the controller or processor envisages offering

services to data subjects in one or more Member States in the Union” [34], or when the

processing is related to monitoring the behaviour of such data subjects in so far as their

behaviour occurs within the Union (see Art.3). Therefore, international SNSs such as FB,

Twitter, and YouTube are also targeted.


4.2.3.1 Awareness and Transparency: Information Obligations of the Controller and

Processor

The first privacy condition is awareness, divided into user awareness and transparency, describing the precondition for the exercise of rights attributed to the data subject, that is, the SNS user. While awareness, as described above, relates to an intrinsic motivation of the data subject (which cannot be claimed, but only supported by the legislator), transparency corresponds to a duty of the controller and the processor and can be enforced by imposing information obligations. In fact, increased information leads to greater knowledge on the part of the data subject and eventually to enhanced user awareness.

The GDPR sets out its core principles in Art.5. The importance and value of transparency as a

principle relating to the processing of PD is demonstrated by the fact that it is listed, together

with lawfulness and fairness, at the very beginning of Art.5, stating that PD shall be “processed

lawfully, fairly and in a transparent manner in relation to the data subject” [Art.5.a].

Art.6 is a central provision that also touches on transparency aspects. According to Art.6.1.a,

“processing shall be lawful only if and to the extent that […] the data subject has given consent

to the processing of his or her personal data for one or more specific purposes”. The informed

consent of the data subject generally is a precondition for the lawfulness of data processing.

However, in the absence of consent of the data subject, Art.6 allows the data processing under

certain other circumstances. Art.7.2 sets out the conditions for consent and stipulates that “if the

data subject's consent is given in the context of a written declaration which also concerns other

matters, the request for consent shall be presented in a manner which is clearly distinguishable

from the other matters, in an intelligible and easily accessible form, using clear and plain

language”. According to Art.7.3, the data subject shall be informed of the fact that “the

withdrawal of consent does not affect the lawfulness of processing based on consent before its

withdrawal”. Art.8 concerns the conditions applicable to child’s consent in relation to

information society services.

The key article regarding awareness and transparency is Art.12. Art.12.1 stipulates that

controllers need to provide any information “in a concise, transparent, intelligible and easily

accessible form, using clear and plain language”. In addition, Art.12.3 obliges the controller

“to provide the information without undue delay and in any event within one month of receipt

of the request of the data subject”. According to Art.12.4, the controller needs to inform the data


subject of any reasons for not taking action and on the possibility of lodging a complaint and

seeking a judicial remedy. The information must be provided free of charge if the request is not

manifestly unfounded or excessive [Art.12.5].

Articles 13 and 14 concern the information duties of the controller: on the one hand, where PD is collected from the data subject [Art.13]; on the other hand, where PD has been obtained indirectly from third parties [Art.14]. The information obligations are relatively extensive and cover,

inter alia, the identity and contact details of the controller [Art.13.1.a], the purposes of the

processing for which the PD is intended as well as the legal basis for the processing [Art.13.1.c],

“the legitimate interests pursued by the controller or by a third party” [Art.13.1.d] and the

recipients or categories of recipients of the PD [Art.13.1.e]. In case “the controller is able to

demonstrate that it is not in a position to identify the data subject, the controller shall

nevertheless inform the data subject accordingly, if possible” [Art.11.2].

The controller is also obliged to inform the data subject – “at the latest at the time of the first

communication with the data subject” – about the “right to object to the processing of PD”

according to Art.21.1 and 2. This information must be presented clearly and separately from

any other information [Art.21.4]. Another reference to transparency can be found in the

definition of consent in [Art.4.11]: According to this article, consent “means any freely given,

specific, informed and unambiguous indication of the data subject's wishes”.

These provisions target the controller and processor’s duty to fully inform the data subject about

the content and amount of the data which is collected and fall under the transparency aspect of

the framework. They safeguard the data subject’s right to information and thereby enable her

in the first place to enforce her rights. Indirectly, they also touch the framework’s condition of

privacy awareness, because an increase in the data subject's information necessarily leads to greater attention to and care about her own rights. In addition, Art.57 should be mentioned. It

regulates the tasks of the independent supervisory authorities provided by the corresponding EU Member States. In particular, Art.57.1.b mentions the duty to "promote public awareness". Hence, the responsibility to raise user privacy awareness is placed in state hands and is not an obligation of the data controller or the processor.


4.2.3.2 User Enforcement and Control: The Data Subject’s Rights

The GDPR not only imposes obligations on the controller and processor, but also, conversely, enshrines corresponding rights of the data subject. In fact, the concept of privacy reveals itself

in law through the attribution of certain enforceable rights to individuals, in the case of the

GDPR through the attribution of rights to “data subjects”. All the rights of data subjects

enshrined within the GDPR flow from the fundamental right of the protection of PD as

established in Article 8 of the Charter of Fundamental Rights of the EU (CHFR) as well as

Article 16.1 of the Treaty on the Functioning of the European Union (TFEU). It is important to

note that the GDPR particularly protects special categories of PD. First, the processing of

certain highly sensitive data such as PD revealing racial or ethnic origin is generally forbidden

[Art.9.1] and in particular protected (e.g. Art.6.4.c, 17.1.b, 20.1.a and 22.4). Second, the

processing of PD relating to criminal convictions and offences is also given special protection

[Art.10].

The GDPR dedicates an entire chapter [Chapter III, Art.12-23] to the rights of the data subject,

which is subdivided into sections on transparency and modalities, information and access to

PD, rectification and erasure, the “right to object and automated individual decision-making”,

and restrictions. Articles 12 to 14 address transparency and information obligations of the

controller and have therefore been considered above. However, it is worth noting that according

to Art.12.2, the controller shall facilitate the exercise of data subject rights.

The data subject’s rights start in Art.15 with the right of access, which lays down that “the data

subject shall have the right to obtain from the controller” confirmation as to whether or not PD

concerning her are being processed. Further, where that is the case, access to the PD must be

given as well as certain other enumerated information such as the purposes of the processing

and the categories of PD concerned, the recipients to whom the data have been or will be

disclosed or the period for which the PD will be stored. In addition, according to Art.15.2, “the

data subject shall have the right to be informed of the appropriate safeguards” where PD is

transferred to a third country or to an international organisation. Art.15.3 sets out the right to

obtain a copy of the PD undergoing processing. Art.16 gives the data subject the right to obtain

from the controller without undue delay the rectification of inaccurate PD concerning her and

the right to have incomplete PD completed.


One of the most-debated rights is the “right to erasure” laid down in Art.17, which is more

commonly known as right to be forgotten. It contains both a “right to erasure” for the data

subject as well as an obligation to erase for the controller when certain enumerated grounds

apply. This article has undergone major changes during the legislation procedure. The first

proposal provided for a full right to be forgotten, which was then changed to a right of erasure

until its adoption. Only in Art.17.2 can one still see the roots of the concept of the right to be forgotten: where the controller has made the PD public and is obliged pursuant to paragraph 1 to erase the data, it shall take reasonable steps "to inform controllers which are

processing the personal data that the data subject has requested the erasure by such controllers

of any links to, or copy or replication of, those personal data”. The concept of such a right had

already been enshrined in Art.12 of the Data Protection Directive and it was the European Court

of Justice, which, ruling on this Directive in May 2014, gave individuals the right – under

certain conditions – to ask search engines to remove links with personal information about them

(Case C-131/12 - Google Spain). According to the court, this applies where the information is

inaccurate, inadequate, irrelevant or excessive (para 93 of the ruling). Art.17.3 lays down

limitations of the “right to erasure” when the processing is necessary for certain reasons such

as the exercise of the right of freedom of expression and information.

Art.18 attributes the “right to restriction of processing” under certain conditions, also subject

to limitations. Art.19 obliges the controller to communicate any rectification or erasure of PD

or restriction of processing “to each recipient to whom the personal data have been disclosed,

unless this proves impossible or involves disproportionate effort”. Upon request of the data

subject, “the controller shall also inform the data subject about those recipients”.

Art.20 lays down the new “right to data portability”, giving data subjects the right to receive

the PD which she has provided to a controller in a structured, commonly used and machine-

readable format and giving the right to transmit those data to another controller without

hindrance from the controller for which the data have been provided. Where technically

feasible, “the data subject shall have the right to have the PD transmitted directly from one

controller to another” according to Art.20.2.
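The format requirement of Art.20 ("structured, commonly used and machine-readable") can be illustrated with a short sketch. The following Python fragment is a hypothetical illustration, not an actual SNO interface; the record fields and the helper name `export_portable` are assumptions chosen for demonstration:

```python
import csv
import io
import json

# Hypothetical user-provided PD held by a controller (illustrative only).
profile = {
    "user_id": "u123",
    "posts": [{"date": "2017-05-01", "text": "Hello"}],
    "contacts": ["u456", "u789"],
}

def export_portable(record, fmt="json"):
    """Serialise user-provided PD into a structured, commonly used and
    machine-readable format, as Art.20.1 requires."""
    if fmt == "json":
        return json.dumps(record, indent=2)
    if fmt == "csv":  # flat key/value rows, for simplicity
        buf = io.StringIO()
        writer = csv.writer(buf)
        for key, value in record.items():
            writer.writerow([key, json.dumps(value)])
        return buf.getvalue()
    raise ValueError("unsupported format")

blob = export_portable(profile)
restored = json.loads(blob)  # a receiving controller can re-ingest the data
assert restored == profile
```

The round trip at the end illustrates the point of Art.20.2: only when the export is lossless and machine-readable can the data be transmitted "directly from one controller to another" without hindrance.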

Art.21 awards the “right to object”, on grounds relating to her particular situation, at any time

to processing of PD which is either necessary “for the performance of a task carried out in the

public interest or in the exercise of official authority vested in the controller" or which is necessary for the purposes of "the legitimate interests pursued by the controller or by a third


party”. “The controller is no longer allowed to process the personal data unless the controller

demonstrates compelling legitimate grounds for the processing which override the interests,

rights and freedoms of the data subject or for the establishment, exercise or defence of legal

claims” [Art.21.1]. Art.21.2 expressly mentions the “right to object” where PD are processed

for direct marketing purposes, which includes “profiling to the extent that it is related to such

direct marketing”. Art.21.6 enables the data subject to object to processing of data for scientific

or historical research purposes “or statistical purposes, unless the processing is necessary for

the performance of a task carried out for reasons of public interest”. Finally, Art.22 gives data

subjects the right not to be subject to a decision based solely on automated processing, including

profiling, which produces legal effects concerning her or similarly significantly affects her.

Certain very sensitive data is particularly protected according to Art.22.4.

According to Art.26.3, where there are joint controllers as defined in Art.26.1, the data subject

is entitled to “exercise his or her rights under the Regulation in respect of and against each of

the controllers”.

In addition, articles 6, 7 and 8 which have been discussed above can also be named here. They

do not contain data subject’s rights, but set out that the processing of PD generally requires the

data subject’s informed consent and enumerate the conditions for a lawful processing of PD.

According to Art.7.3, the data subject has the right to withdraw her consent at any time, which

explicitly does not have an effect on the lawfulness of processing based on consent before its

withdrawal.

4.2.3.3 Monitoring and Accountability: Provisions on remedies, liability and penalties

as “flanking measures”

The following sections focus on the third layer of the privacy framework, which deals with the

trustworthiness of the SNO “to implement an appropriate level of privacy protection” [47]. As

described before, this privacy condition can be further subdivided into the privacy

classifications monitoring and accountability. Where accountability describes the possibility

to sanction an SNO for non-compliance with agreed-upon rules, monitoring focuses on the SNO's obligation to disclose its actions transparently and comprehensibly to a supervisory authority.

Both aspects are incorporated within the GDPR.


In order to effectively enforce her privacy rights, the data subject needs regulations on remedies,

liability and penalties. The general principle of accountability of the controller is laid down in

Art.5.2: “The controller shall be responsible for, and be able to demonstrate compliance with,

paragraph 1 (‘accountability’).” Art.5.1 enumerates the GDPR’s main principles relating to the

processing of PD: lawfulness, fairness and transparency, purpose limitation, data minimisation,

accuracy, storage limitation and integrity and confidentiality.

Furthermore, Chapter IV of the GDPR is dedicated to the controller and the processor. The first

article of this chapter, Art.24, addresses the responsibility of the controller as a general

obligation: “the controller shall implement appropriate technical and organisational measures

to ensure and to be able to demonstrate that processing is performed in accordance with this

Regulation. Those measures shall be reviewed and updated where necessary.” Unless

disproportionate to the processing activities, the referred measures “shall include the

implementation of appropriate data protection policies by the controller” according to Art.24.2.

Art.24.3 further stipulates that “adherence to approved codes of conduct as referred to in Article

40 or approved certification mechanisms as referred to in Article 42 may be used as an element

to demonstrate compliance with the obligations of the controller”.

Another central provision is Art.25, which obliges the controller to “implement appropriate

technical and organisational measures, such as pseudonymisation, which are designed to

implement data-protection principles, such as data minimisation, in an effective manner and to

integrate the necessary safeguards into the processing in order to meet the requirements of this

Regulation and protect the rights of data subjects." In doing so, the controller shall take into account

“the state of the art, the cost of implementation and the nature, scope, context and purposes of

processing as well as the risks of varying likelihood and severity for rights and freedoms of

natural persons posed by the processing” [Art.25.1]. In addition, Art.25.2 obligates the

controller to ensure that only the PD “which are necessary for each specific purpose of the

processing are processed”. This obligation is quite extensive as Art.25.2 clarifies that this

obligation “applies to the amount of personal data collected, the extent of their processing, the

period of their storage and their accessibility”. Finally, sentence three of that paragraph deals

with default settings and requires that “such measures shall ensure that by default personal data

are not made accessible without the individual's intervention to an indefinite number of natural

persons”.
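
The default-settings rule of Art.25.2 sentence 3 can be illustrated with a minimal sketch (the class and audience names are hypothetical, not taken from the regulation): a freshly created item starts with the most restrictive audience, and any wider visibility requires an explicit intervention by the individual.

```python
from dataclasses import dataclass

# Hypothetical audience levels, ordered from most to least restrictive.
AUDIENCES = ("only_me", "friends", "public")

@dataclass
class PostSettings:
    # Sketch of Art.25.2 sentence 3: by default, personal data is not
    # made accessible to an indefinite number of natural persons, so
    # the initial audience is the most restrictive one.
    audience: str = "only_me"

    def widen_audience(self, new_audience: str) -> None:
        # Wider visibility is only reached via an explicit user action.
        if new_audience not in AUDIENCES:
            raise ValueError(f"unknown audience: {new_audience}")
        self.audience = new_audience

settings = PostSettings()
assert settings.audience == "only_me"   # privacy-friendly default
settings.widen_audience("friends")      # explicit intervention
assert settings.audience == "friends"
```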

88 4 SNS Privacy and Regulation

Art.26 concerns the special case of joint controllers, i.e. “two or more controllers jointly

determining the purposes and means of processing”, and is mentioned for completeness. The

GDPR does not only consider the controller and its responsibilities, but also the processor (cf.

introduction of 4.2). Articles 28 and 29 deal with the specific obligations imposed on the

processor and on the controller when it delegates the processing to a processor. The controller

and the processor are furthermore obliged to “implement appropriate technical and

organisational measures to ensure a level of security appropriate to the risk” as set out in detail

in Art.32, including, inter alia, “the pseudonymisation and encryption of personal data”

[Art.32.1.a].
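
Pseudonymisation, named in Art.32.1.a as one possible security measure, can be sketched as follows. This is only one illustrative technique (a keyed hash, with the key kept separately from the data set); the GDPR itself does not prescribe any particular method, and the key value here is a placeholder.

```python
import hashlib
import hmac

# The key plays the role of the "additional information kept separately":
# without it, the pseudonyms cannot be attributed to a natural person.
SECRET_KEY = b"hypothetical-key-held-by-the-controller"

def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier with a keyed-hash pseudonym."""
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()

# Records of the same person remain linkable for processing purposes,
# while re-identification requires access to the separately held key.
assert pseudonymise("alice@example.org") == pseudonymise("alice@example.org")
assert pseudonymise("alice@example.org") != pseudonymise("bob@example.org")
```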

All the articles mentioned above concern the accountability aspect of trustworthiness as

defined in the privacy framework. They clarify that the controller and processor can be held

liable for infringements of the data subject’s rights. The GDPR also contains provisions on the

monitoring privacy classification, dealing with the SNO’s obligation to disclose its actions to

the supervisory authority. According to Art.30.1 the controller “shall maintain a record of

processing activities under its responsibility” which must contain a list of information such as,

for example, “the name and contact details of the controller and, where applicable, the joint

controller, the controller's representative and the data protection officer” (DPO) as well as the

purposes of the processing. According to Art.30.2, similar obligations are imposed on the

processor. The record must be made available to the supervisory authority on request [Art.30.4].

It is important to note that the obligation to maintain a record of processing activities does –

subject to certain exceptions – “not apply to an enterprise or an organisation employing fewer

than 250 persons” [Art.30.5]. The controller, the processor and their representatives are under

a duty to cooperate with the supervisory authority on request [Art.31].
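
The headcount exemption of Art.30.5 amounts to a simple threshold rule, sketched below; the function name is ours, and the exceptions the article itself lists (e.g. non-occasional or risky processing) are collapsed into a single flag.

```python
def record_obligation_applies(employees: int, exception_applies: bool = False) -> bool:
    # Sketch of Art.30.5: the duty to maintain a record of processing
    # activities does not apply to an enterprise or organisation
    # employing fewer than 250 persons, unless one of the listed
    # exceptions is triggered.
    return employees >= 250 or exception_applies

assert not record_obligation_applies(50)
assert record_obligation_applies(50, exception_applies=True)
assert record_obligation_applies(1000)
```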

Of course, the controller is also obliged to notify any PD breach to the competent supervisory

authority, “unless the personal data breach is unlikely to result in a risk to the rights and

freedoms of natural persons” [Art.33.1]. Such a notification must be performed, “where

feasible, within 72 hours after having become aware of the data breach”; where this deadline

has lapsed, the notification must be accompanied by the reasons for the delay [Art.33.1]. The

minimum information which must be contained within the notification is set out in Art.33.3.

According to Art.33.4, the controller “shall document any personal data breaches, comprising

the facts relating to the personal data breach, its effects and the remedial action taken” to

“enable the supervisory authority to verify compliance with this Article.”
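
The timing rule of Art.33.1 can be paraphrased in a short sketch (the field and function names are illustrative): a notification made later than 72 hours after the controller became aware of the breach must be accompanied by reasons for the delay.

```python
from datetime import datetime, timedelta

NOTIFICATION_WINDOW = timedelta(hours=72)  # Art.33.1

def delay_reasons_required(became_aware: datetime, notified: datetime) -> bool:
    """True if the notification must state the reasons for the delay."""
    return notified - became_aware > NOTIFICATION_WINDOW

aware = datetime(2018, 5, 25, 9, 0)
assert not delay_reasons_required(aware, aware + timedelta(hours=48))
assert delay_reasons_required(aware, aware + timedelta(hours=80))
```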


The monitoring privacy classification is also reflected in Art.34, which obliges “the controller to

communicate the personal data breach to the data subject without undue delay when the

personal data breach is likely to result in a high risk to the rights and freedoms of natural

persons.” Paragraph 3 of Art.34 sets out important exceptions from this obligation for the

controller. For example, where “the controller has taken subsequent measures which ensure

that the high risk to the rights and freedoms of data subjects referred to in paragraph 1 is no

longer likely to materialise”, the notification of the data subject is not required [Art.34.3.b].

According to Art.34.4, the supervisory authority can – having considered the likelihood of the

PD breach resulting in a high risk – require the controller to communicate the breach to the data

subject.

Section three of the chapter on the controller and the processor addresses the data protection

impact assessment (DPIA) [Art.35] and prior consultation [Art.36]. The controller is required to

carry out a DPIA prior to the processing “where a type of processing in particular using new

technologies, and taking into account the nature, scope, context and purposes of the processing,

is likely to result in a high risk to the rights and freedoms of natural persons” [Art.35.1]. This

“assessment examines the impact of the envisaged processing operations on the protection of

personal data”. Such a DPIA is in particular required when certain very sensitive PD is being

processed or in cases of a systematic and extensive evaluation of personal aspects relating to

natural persons which is based on automated processing, including profiling (see Art.35.3).

Paragraph 7 of Art.35 lists the minimum information which the assessment needs to contain.

According to Art.35.11, the controller shall, where necessary, “carry out a review to assess if

processing is performed in accordance with the data protection impact assessment at least when

there is a change of the risk represented by processing operations.”

Art.36 on prior consultation also falls under the monitoring privacy classification. Where a

DPIA under Art.35 indicates that “the processing would result in a high risk in the absence of

measures taken by the controller to mitigate the risk”, the controller shall consult the

supervisory authority prior to processing [Art.36.1]. The authority, where it is of the opinion

that the intended processing referred to in paragraph 1 would infringe the Regulation, is

required, within eight weeks, to “provide written advice to the controller and, where applicable

to the processor, and may use any of its powers referred to in Article 58” [Art.36.2].

Section IV of Chapter IV on the controller and the processor covers provisions on the DPO.

Art.37 lays down the conditions under which a DPO must be designated as well as other aspects


surrounding the designation. Art.38 addresses the position of the DPO, while Art.39 deals with

her tasks. Articles 40 to 43 provide for the possibility to draw up and monitor codes of conduct

as well as data protection certification mechanisms and relating certification bodies.

The provisions in Chapter V on transfers of PD to third countries or international organisations

will not be dealt with in detail as they do not directly concern the relationship between the data

subject and the company which processes the data. However, it is important to note that “any

transfer of personal data which are undergoing processing or are intended for processing after

transfer to a third country or to an international organisation shall take place only if, subject

to the other provisions of this Regulation, the conditions laid down in this Chapter are complied

with by the controller and processor, including for onward transfers of personal data from the

third country or an international organisation to another third country or to another

international organisation” [Art.44].

Chapter VI contains provisions on the independent supervisory authorities. It will not be

discussed in detail as for the purposes of this analysis it is only important to know that such rules

exist. They too fall under the monitoring privacy classification of the privacy framework as

they establish the supervisory authorities to which the SNO is obliged to report. Art.58 is

important as it lays down the powers of the supervisory authorities (investigative, corrective,

authorisation, and advisory powers, and additional powers provided for by national law).

Chapter VII regulates the relationship between different authorities and the newly created

European Data Protection Board.

Finally, Chapter VIII of the GDPR contains rules on remedies, liability and penalties in order

to ensure that the proclaimed rights and obligations do not remain empty words. The relevant

provisions accordingly fall under the accountability privacy classification within the privacy

framework. Art.77 lays down the right to lodge a complaint with the supervisory authority if

the data subject considers that the processing of PD relating to her infringes the regulation.

Furthermore, the regulation stipulates that the data subject has a right to an effective judicial

remedy, both against the supervisory authority as well as against the controller or processor

[Art.78 and 79]. The GDPR also provides for an action for failure to act in Art.78.2. Art.80.1

gives the data subject “the possibility to mandate a not-for-profit body, organisation or

association” which complies with certain conditions, to lodge the complaint on her behalf, to

exercise the rights referred to in articles 77 – 79 on her behalf, and to exercise the right to

receive compensation referred to in Art.82 on her behalf where provided for by member state


law. This is a major novelty compared to the existing law. Member States may provide that

any body, organisation or association referred to in Art.80.1 “independently of a data subject's

mandate, has the right to lodge a complaint with the supervisory authority and to exercise the

rights referred to in Articles 78 and 79”.

Art.82 governs the right to compensation and liability in the case of violations of the regulation:

Compensation must be granted for material as well as non-material damages. The different

responsibilities of controller and processor are taken account of by the fact that the processor

will be liable for the damage caused by processing only where it has not complied with

obligations of the GDPR specifically directed to processors or where it has acted outside or

contrary to lawful instructions of the controller [Art.82.2]. Any controller, however, who is

“involved in processing shall be liable for the damage caused by processing which infringes

this Regulation” [Art.82.2]. Processor and controller can only escape liability if they are able

to prove that they are under no circumstances responsible for the event giving rise to the damage

[Art.82.3], for example by providing evidence that appropriate technical and organisational

measures have been implemented to ensure and to be able to demonstrate that processing is

performed in accordance with the regulation [Art.24.1][221].

In addition, where the controller and/or the processor infringe their obligations pursuant to

articles 8, 11, 25 to 39 and 42 and 43, they shall “be subject to administrative fines up to €

10,000,000, or in the case of an undertaking, up to 2 % of the total worldwide annual turnover

of the preceding financial year, whichever is higher”. Infringements of, inter alia, the data

subject’s rights pursuant to articles 12 to 22 are sanctioned with “fines up to € 20,000,000, or

in the case of an undertaking, up to 4% of the total worldwide annual turnover of the preceding

financial year, whichever is higher” [Art.83.5.b]. The same applies to the violation of the

provisions on “the basic principles for processing, including conditions for consent pursuant to

Articles 5, 6, 7 and 9, the transfer of personal data to a recipient in a third country or an

international organisation pursuant to Articles 44 to 49 as well as some other enumerated

cases” [Art.83.5]. In other cases, such as the infringement of the obligations to notify the

authority of a data breach, controllers and processors face sanctions “up to € 10,000,000 or in

the case of an undertaking, up to 2% of the total worldwide annual turnover of the preceding

financial year, whichever is higher” [Art.83.4]. Art.83.3 stipulates that where a controller or

processor, “for the same or linked processing operations, infringes several provisions of this

Regulation, the total amount of the administrative fine shall not exceed the amount specified


for the gravest infringement.” In addition, “non-compliance with an order by the supervisory

authority as referred to in Article 58(2)” is subject to administrative fines up to € 20,000,000,

or in the case of an undertaking, up to 4 % of the total worldwide annual turnover of the

preceding financial year, whichever is higher [Art.83.6]. According to Art.84, Member States are

allowed to adopt additional penalties for infringements of the regulation, especially for

infringements that are not subject to administrative fines pursuant to the regulation’s provisions,

which they must notify to the European Commission by 25 May 2018.
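
The two-tier fine caps of articles 83.4 and 83.5 follow the same "whichever is higher" rule, which can be written out directly (the function name is ours; amounts in euros):

```python
def max_fine(annual_turnover_eur: float, severe: bool) -> float:
    """Upper bound of the administrative fine under Art.83.

    severe=True  -> Art.83.5 tier: EUR 20 m or 4 % of the total
                    worldwide annual turnover, whichever is higher.
    severe=False -> Art.83.4 tier: EUR 10 m or 2 %, whichever is higher.
    """
    if severe:
        return max(20_000_000, 0.04 * annual_turnover_eur)
    return max(10_000_000, 0.02 * annual_turnover_eur)

# A smaller undertaking is capped by the flat amount ...
assert max_fine(100_000_000, severe=True) == 20_000_000
# ... a very large one by the turnover-based percentage.
assert max_fine(40_000_000_000, severe=True) == 1_600_000_000
```

The Art.83.3 rule that linked infringements are capped at the amount for the gravest infringement is not modelled here.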

4.2.4 Interim Result: Conditions and Classifications met by the GDPR

In the previous subchapters all privacy relevant GDPR articles were listed and explained as

well as sorted according to the three different privacy conditions of awareness, control and

trustworthiness. As one can see in Table 10 the articles covering privacy awareness are almost

exclusively related to transparency. The articles 4.11, 5, 6.1, 7, 8, 11.2, 12 – 14, 15.2 and 21.1,

21.2, 21.4 all aim to increase transparency for the data subjects, i.e. the SNS

users. But only Art.57.1.b defines the task of independent supervisory authorities to raise user

awareness. In addition, the release of the GDPR itself, the corresponding media coverage and

the user attention caused by increased transparency can also be interpreted as indirect increases

in user awareness [222].

Regarding control, the articles 6 – 10, 12.2 and 15 – 22 address the rights of the data subjects,

i.e. the SNS users, and therewith their data sovereignty. The covered rights range from the

“right of access” through the “right to data portability” to the “right to object”. However, except

for Art.12.2 stating that “the controller shall facilitate the exercise of data subject rights under

Articles 15 to 22” and Art.15.2 and its right for data subjects to be informed of appropriate

safeguards when data is transferred to a third country or international organisations, there are

no user enforcement measures within the legislation articles.

Finally, regarding the privacy condition of trustworthiness, the articles 24, 30, 31, 33 – 36, 40

– 43, 58 as well as 77–79 and 80.1 cover the aspect of monitoring with detailed rights for the

data subject respectively the SNS user to lodge a complaint and the right to an effective judicial

remedy. However, monitoring is only provided if complaints are lodged. Additionally, the GDPR

does encourage the establishment of data protection certification mechanisms (e.g. an algorithm

TÜV [223]) at member-state level. Concerning accountability, Art.5.2 lays down the

general principle of accountability and the articles 77 – 79 and 80 as well as 82 – 83 govern the


right to lodge a complaint, the right to an effective judicial remedy as well as the right to

compensation and liability and with that the extent of financial penalties.

Three major conditions for privacy protection / Privacy classifications / GDPR Articles:

Awareness
  User awareness: 57.1.b
  Transparency: 4.11; 5.1; 6.1; 7; 8; 11.2; 12; 13; 14; 15.2; 21.1; 21.2; 21.4

Control
  User enforcement: 12.2; 15.2
  Data sovereignty: 6 – 10; 12.2; 15 – 22

Trustworthiness
  Monitoring: 24; 30; 31; 33 – 36; 40 – 43; 58; 77 – 79; 80.1
  Accountability: 5.2; 77 – 79; 80; 82 – 83

Table 10. Privacy Conditions and Classifications met by the GDPR.

4.2.5 Evaluation: Does the GDPR cover it all?

In this subchapter the coverage and the sustainability of the GDPR are evaluated. To this end, the

second and third dimensions of the privacy framework (cf. subchapter 4.1.3) are used first,

consisting of the seven privacy types of Finn et al. [60] and the two points in time ex ante and

ex post. Finally, a legal and an economic assessment is provided, concerning the binding nature

and sustainability of the legislation.

4.2.5.1 The Privacy Types contained within the GDPR

Finn et al. extended the privacy categorization of Clarke and distinguished between seven

different types of privacy [58, 60]. In the following paragraphs those different types are defined

briefly and it is shown whether, where and how they are covered within the GDPR. The findings

are then summarized in Table 11.

Privacy of the person is related to the integrity of a person’s body. Corresponding threats for

this privacy type are “many medical and surveillance technologies and practices” [60].

Art.4 of the GDPR defines the corresponding data as “genetic data” [Art.4.13] or “data

concerning health” [Art.4.15]. Most relevant is Art.9.1, which prohibits, among others, the

processing of genetic, biometric or health data. In addition, Art.9 data is specifically protected

in the context of automated individual decision making, including profiling [Art.22.4].

Regarding processing which is necessary for, inter alia, “purposes of preventive or occupational
medicine” [Art.9.2.h], paragraph 3 foresees an additional safeguard by requiring the “data to
be processed by or under the responsibility of a professional subject to the obligation of
professional secrecy”. In addition, paragraph 4 allows Member States to “maintain or introduce
further conditions, including limitations, with regard to the processing of genetic data,
biometric data or data concerning health”. The “right to erasure” [Art.17] also contains a list
of exceptions in paragraph 3. The controller is, inter alia, not obliged to erase the PD “to the
extent the processing is necessary for reasons of public interest in the area of public health”
[Art.17.3.c]. The same exception (also among many others) is provided in the general restriction
Art.23.1.e. This means that the controller has the possibility to invoke these exceptions in
order to process the data or to repel

a claim of the data subject. Finally, health data is also mentioned in Art.36.5 regarding prior

consultation to processing and Art.88.1 when it comes to the special case of data processing in

the context of employment. According to Rec.54, “public health” should be interpreted broadly

(reference is made to the definition in Regulation (EC) No 1338/2008 of the European

Parliament and of the Council). However, Rec.54 emphasizes that “such processing of data

concerning health for reasons of public interest should not result in personal data being

processed for other purposes by third parties such as employers or insurance and banking

companies”. In summary, the legislation takes the privacy of the person strongly into account

with the general prohibition of processing any genetic, biometric or health data, but opens

questionable loopholes with the provided list of exceptions.

Privacy of personal behaviour and action is characterized by Finn et al. as “sensitive issues

such as sexual preferences and habits, political activities and religious practices” [60]. Again,

Art.9.1 generally prohibits the processing of PD “revealing racial or ethnic origin, political

opinions, religious or philosophical beliefs, or trade union membership […] or data concerning

a natural person's sex life or sexual orientation” with the aforementioned exceptions (in

particular Art.9.2 – 3). Thus, this privacy type is covered by the legislation to the same extent

as the aforementioned privacy of the person.


Privacy of personal communication stands for the inviolability of all kinds of communication,

may they be verbal or digital. This privacy type is not directly mentioned in the core legislation

but in the recitals. According to Rec.4, the GDPR recognises the fundamental right to respect

for private and family life, home and communication as recognised in the Charter of

Fundamental Rights and in the Treaties of the European Union. However, the different types of

communication are not further defined beyond this general abstract level.

Privacy of data and image contains the privacy of all sorts of PD and images including

“concerns about making sure that individuals’ data is not automatically available to other

individuals and organisations” [60]. It represents the classical data privacy understanding for

which the legislation was originally designed. As mentioned above, Art.4.1 gives a clear definition

of PD. Moreover, Art.4.14 adds the definition of “biometric data” including “facial images”

which enable the unique identification of natural persons. Thus, the privacy of data and image

is fully covered.

Privacy of thoughts and feelings embodies the creative freedom of individuals and their right

to think whatever they like, as well as their right to be sure that their thoughts and feelings are

not revealed to others against their will and without their knowledge. References to this privacy

type can again be found in Rec.4, which reads: “This Regulation respects all fundamental rights

[…] in particular […] the freedom of thought”. Furthermore, the freedom of thought aspect of

this privacy type is covered by Art.9.1 which prohibits the processing of “data revealing […]

political opinions, religious or philosophical beliefs”. However, no further references or

definitions regarding the freedom of thought or the privacy of thoughts and feelings or the

privacy of a natural person’s psychology can be found. It is open to discussion whether the current GDPR

is able to protect this privacy type appropriately.

Privacy of location and space encompasses the right of individuals not to reveal their current

location and their motion profile. The legislation takes it into account with Art.4.1 where

location data is mentioned as a possible identifier for a natural person. Furthermore, Art.4.4

explicitly refers to a natural person’s location and movements when it defines the extent of

profiling. Hence, with location and movement data explicitly mentioned, the privacy type of

location and space is within the GDPR range.

Privacy of association “is concerned with people’s right to associate with whomever they

wish, without being monitored” which is crucial to modern democratic society [60]. It is


explicitly included within the Art.9.1 which defines special, very sensitive categories of PD.

The processing of such data is prohibited aside from the already mentioned exceptions. Art.9.1

covers political opinions, religious or philosophical beliefs and trade union membership, which

represent the most sensitive types of associations for this type of privacy.

In summary, the GDPR is foremost designed to protect the privacy of data and image, including

personal communication and location data, as well as the privacy of the person including in

particular health data. Moreover, the privacy of personal behaviour and action as well as the

privacy of association are also covered to some degree. However, the legislation does little to

address the privacy of thoughts and feelings (see Table 11).

Privacy Types: Relevant GDPR Articles

Privacy of the person: Rec. 35; 53; 54; Art. 4.13; 4.15; 9; 17.3.c; 23.1.e; 36.5; 88.1
Privacy of personal behaviour and action: Art. 9
Privacy of personal communication: Rec. 4
Privacy of data and image: Art. 4.1; 4.14
Privacy of thoughts and feelings: Art. 9.1; Rec. 4
Privacy of location and space: Art. 4.1; 4.4
Privacy of association: Art. 9

Table 11. Privacy Types covered by the GDPR.

4.2.5.2 The Time Dimension

As already mentioned, the multi-dimensional privacy framework distinguishes between the two

different points in time ex ante and ex post for privacy to be enforced (cf. subchapter 4.1.3).

Ex post privacy is the enforcement of the user’s privacy preferences after she joined a service.

In the case of SNS this represents the usage of privacy options and/or dashboards to control the

usage and delete or correct data which has already been exposed (e.g. changing the audience of

an existing posting or deleting it). One characteristic for ex post privacy is that the SNO has

already received and processed the data before the user is able to make a decision about the data

usage. This leads to a partial illusion of privacy, because the original audience who had access

to the data, as well as the SNO, are able to keep an unnoticed copy of the deleted or edited data.

Ex ante privacy on the other hand is the enforcement of user preferences before the data is

exposed. In the case of SNS usage several examples are conceivable, e.g. the

determination of an audience before a posting is created and sent by the user, or the negotiation


and determination of the generally preferred data gathering and processing rights before the

SNS account is created by the user.

Considering the GDPR, the analysis shows that the main part of the previously identified

privacy relevant legislation articles represents ex post privacy provisions. Merely articles 9, 12, and 13

are possible ex ante policies. Art.9 is obvious, because it contains a long list of data types which,

subject to exceptions, shall in general not be processed from the outset. This clearly is an ex

ante provision. However, articles 12 and 13 handle the information obligations of the data

controller, respectively the SNO. In Art.13.1 the GDPR states that “the controller shall, at the time

when personal data are obtained, provide the data subject with all of the following information:

[…]”. This leaves a margin for the SNO to inform the user either upfront or at the time the data

transfer is happening. The latter would make these articles ex post provisions as well, because the

user would only have the opportunity to withdraw her consent or object to the processing in

retrospect (i.e. after the data is obtained by the SNO). Further ex post provisions are: Art.14,

the duty to inform the data subject (the user) about the source of obtained PD about her; Art.15,

the “right of access” by the data subject; Art.16, the “right to rectification”; Art.17, the “right

to erasure”; Art.18, the “right to restriction of processing”; Art.19, the notification obligation

regarding rectification or erasure of PD or restriction of processing; Art.20, the “right to data

portability”; Art.21, the “right to object”. Also the trustworthiness and accountability related

articles have predominantly ex post character, except for the articles 42 and 43 which handle

the possible establishment of data protection certification mechanisms. Those certificates could

be interpreted ex ante by users as a signal of privacy-respecting services and, thus, increase their

transparency. In summary, only the transparency related articles contain an ex ante character,

all other privacy related provisions are only enforceable ex post.

The findings are summarized in the following table, where the different dimensions of privacy

explained above are combined to create the respective privacy sectors (see Table 12). In order to symbolize

to what degree a privacy sector of the multidimensional framework is covered by the GDPR it

is distinguished between four different scores of coverage:


The best score symbolized by “” represents that the sector is fully covered by the GDPR: The

relevant data types are defined in the legislation and it also provides articles concerning the

privacy classification of the privacy framework.

The second-best score symbolized by “●” states that the GDPR largely covers this privacy

sector: The legislation contains articles relevant to the privacy type and, again, provides articles

concerning the respective privacy classification.

The second-lowest score symbolized by “○” describes that the GDPR partly covers the

respective privacy sector: The legislation mentions in the recitals that it is targeting to protect

the privacy type and provides articles covering the privacy classification.

The lowest score symbolized by “×” states that the GDPR fails to cover the privacy sector and

neither mentions the privacy type nor the classification. For completeness, it was

distinguished for all sectors not only between privacy types and classification, but also

between ex ante and ex post privacy as a criterion for exclusion.

Columns (privacy classifications, grouped under the three major conditions for privacy protection): Awareness (User awareness, Transparency); Control (User enforcement, Data sovereignty); Trustworthiness (Monitoring, Accountability). Rows: the seven types of privacy, each split by point in time (ex ante / ex post).

1. Person: Ex ante ○ × ×; Ex post ○
2. Behaviour and Action: Ex ante ○ ● × ×; Ex post ○ ● ● ● ● ●
3. Personal Communication: Ex ante ○ ○ × ×; Ex post ○ ○ ○ ○ ○ ○
4. Data and Image: Ex ante ○ × ×; Ex post ○
5. Thoughts and Feelings: Ex ante ○ ● × ×; Ex post ○ ● ● ● ● ●
6. Location and Space: Ex ante ○ × ×; Ex post ○
7. Association: Ex ante ○ ● × ×; Ex post ○ ● ● ● ● ●

Table 12. Privacy Sectors covered by the GDPR, distinguished by four different Scores.


4.2.6 Legal and Economic Capability

In the following subchapters the legally binding nature of the GDPR is analysed. The

practicability and permanency of the legislation are evaluated based on the corresponding

research and literature. The economic incentives and impacts of the GDPR are then examined

with a basic model and observations from recent market development.

4.2.6.1 Legal Analysis: Solid Basis with Room for Improvement

“Beginning of a new era of data protection law” [224], “milestone of European and global data

protection” [87] – the GDPR has definitely been perceived as a success by many [225]. From a

legal point of view, it was an achievement, but open questions and problems remain.

4.2.6.1.1 Major Achievements and Improvements

The GDPR’s success lies in the strengthening of the data subjects’ rights and an increase in the

documentation requirements and burden of proof of the controller. The substantial rights of the

data subject include inter alia the right to be forgotten [Art.17], data portability rights [Art.20],

the “right to object” [Art.21] and the right not to be subject to automated decision making,

including profiling [Art.22].

Furthermore, the GDPR imposes clear conditions for the data subject’s consent in data

processing. Articles 5.2 and 24.1 introduce the fundamental principle of accountability of the

controller with far-reaching documentation requirements. The controller also faces extended

information obligations towards the data subject. For data processors, the GDPR introduces

extensive obligations (including potential liability) in Art.28. In addition, the supervisory

authorities have been granted new and extensive powers to ensure compliance of the

controller’s processing activities with the Regulation. Finally, the GDPR enhances the

enforcement of the data subject’s rights via potential claims for damages (including immaterial

damages) and the imposition of significant fines. Another major achievement of the GDPR

is its applicability to controllers and processors established in non-EU countries when an EU

resident’s PD is processed in connection with goods and services offered to her or the behaviour

of individuals within the EU is “monitored”. The specific provisions on conditions applicable

to child's consent are also to be welcomed [Art.8].

4.2.6.1.2 The Regulation’s Weaknesses

So far, data protection and privacy law within the EU have been characterized by directives.

Unlike a regulation, a directive is not directly applicable and not directly binding in the member

states. Due to the fact that the binding effect of a directive refers only to its objectives, the

member states retain a margin of maneuver when implementing the directive. This led to differences in data protection law across the member states. Therefore, by choosing the form of a

regulation, the EU aims at achieving a harmonized data protection and privacy law within the

EU. By replacing the directive with the regulation, the legislator indeed contributes to the

harmonization to a great extent [87]. However, in a number of areas the regulation still leaves

much space for national derogations of the Member States [226]. This resulted in the

designation of the GDPR as a “hybrid” between regulation and directive [227]. National data

protection law remains applicable with regard to the public sector (cf. Art.23; e.g. national and

public security, national defense, the independence of the judiciary). In addition, the Member

States have discretion to derogate from or specify the GDPR’s provisions in many

areas, e.g. regarding media and employee data protection as well as the alignment with the right

to access information and regarding important objectives of general public interest [87, 228].

In two areas, Member States retain the right to change the standard of protection: First, they

can exclude the possibility to consent to the processing of special categories of PD [Art.9.2.a].

Second, Member States may maintain or introduce further conditions, including limitations,

with regard to the processing of genetic data, biometric data or data concerning health

[Art.9.4][224]. Accordingly, there will be no complete harmonization of data protection and

privacy law and it remains disputable whether the GDPR really represents the general,

foundational rule it intends to be [227].

Apart from the potential non-uniformity due to the flexibility clauses, the GDPR also provides

for numerous undetermined legal notions which could soften the real impact of its provisions.

Regarding, for example, the right to be forgotten, the controller is allowed to “take account of

available technology and the cost of implementation” and is only obliged to take “reasonable”

steps to inform controllers which are processing the PD that the data subject has requested the

erasure by such controllers [Art.17.2]. According to Art.19, the controller shall communicate

any rectification or erasure of PD or restriction of processing to each recipient to whom the PD

have been disclosed, “unless this proves impossible or involves disproportionate effort”.

Undetermined legal notions are a necessary tool in any piece of legislation. However, the extent

to which they are used can be viewed critically [229].

The principle of purpose limitation, as described above, is contained in Art.5.1.b, according to

which PD shall be “collected for specified, explicit and legitimate purposes and not further

processed in a manner that is incompatible with those purposes”. However, the principle also

includes that “further processing for archiving purposes in the public interest, scientific or

historical research purposes or statistical purposes shall, in accordance with Article 89(1), not

be considered to be incompatible with the initial purposes” [Art.5.1.b]. Compared to the old

Data Protection Directive, the GDPR explicitly introduced the possibility to process data for a

purpose other than that for which the PD have been collected [230]. This is also allowed

according to Art.6.4. The wording of this principle has been much-debated and has been

amended several times. Art.6.4 refers to the viewpoint of the controller and obliges it, where

the processing is carried out for a purpose other than for which the PD have been collected and

which is not based on the data subject’s consent, to take into account certain aspects. These

include “any link between the purposes for which the PD have been collected and the purposes

of the intended further processing” [Art.6.4.a], the context in which the PD have been collected

[Art.6.4.b], the nature of the PD [Art.6.4.c], the possible consequences [Art.6.4.d], and the

existence of appropriate safeguards [Art.6.4.e].

The effectiveness of a legal regime is ensured by a system of sanctions in the case of

infringements of the rules. Chapter VIII of the GDPR addresses “remedies, liability and

penalties” in order to “ensure effective protection of personal data throughout the Union”

[Rec.11]. The question is: Are the sanctions provided for in the GDPR sufficiently deterrent?

Do they reflect the legislator’s aim to stress the importance of data protection in the 21st

century? Under current German law, infringements of data protection and privacy law can be

fined up to € 300,000 per infringement.22 To date, German data protection authorities have imposed fines of no more than € 2 million against individual undertakings [231].

The GDPR foresees that authorities can impose administrative fines up to € 20 million, or in

the case of an undertaking, up to 4% of the total worldwide annual turnover of the preceding

financial year, whichever is higher [Art.83.5]. The level of fines has been significantly raised –

which will ensure that many companies will accord data protection and privacy issues a much

22 See Section 43(2), (3) sentence 2 Bundesdatenschutzgesetz („BDSG“– German Federal Data Protection Act).

higher priority. The GDPR adopts the concept of an undertaking as elaborated by the European

Court in EU antitrust law [Rec.150]. This means that the fine of up to 4% of the total worldwide

annual turnover of the preceding financial year refers to the “single economic entity” and

accordingly to the group of undertakings. A single economic entity can consist of different natural or legal persons (e.g. parent company and subsidiaries) and exists when the subsidiary,

although having a separate legal personality, does not decide independently upon its own

conduct on the market, but carries out, in all material respects, the instructions given to it by

the parent company.23 Since the legislator already referred to EU antitrust law, one can, however, question why it did not also transfer its maximum fine to the GDPR. Under EU

antitrust law, an economic entity can be fined up to 10% of the total worldwide annual turnover

of the preceding financial year. This suggests that data protection and privacy law are not

considered as important as EU antitrust law.

Following the adoption of the GDPR, the European Commission continued the modernization

of the data protection framework by discussing a reform of the ePrivacy Directive.24 On 10

January 2017, a draft regulation was adopted. The Regulation shall replace the old ePrivacy

Directive and amend the conditions for the data subject’s consent in the setting of cookies as

well as opt-out options. The ePrivacy Regulation specifically targets the protection of the

private life and the protection of PD in the electronic communications sector.

It will be up to the European courts to clarify open questions in the future in order to enable a

uniform application of the GDPR in all Member States. To sum up, the GDPR has been an achievement with regard to privacy issues. However, it will be of vital importance to closely monitor future developments and to react quickly when changes to the law prove necessary.

23 Settled case-law of the Court of Justice of the European Union, see for example case C-48/69 – ICI, recitals

132 et seq.; case C‑97/08 P – Akzo Nobel, recital 58.

24 Directive 2002/58/EC of the European Parliament and of the Council of 12 July 2002 concerning the

processing of PD and the protection of privacy in the electronic communications sector (Directive on privacy

and electronic communications).

4.2.6.2 Economic Discussion

The economic discussion is separated into two different parts. The first part discusses whether

the incentives and fines of the GDPR will hold the SNOs accountable. The second part explains

which consequences the legislation may have on the SNS market and how this might affect user

privacy.

4.2.6.2.1 SNO incentives to stay GDPR compliant

Besides the privacy coverage of the GDPR examined above, the question of this section is whether SNOs will obey the GDPR requirements or not. Therefore, the crucial factors are

the consequences of non-compliance in articles 82 and 83, the right to compensation and

liability for affected customers, and the general conditions for imposing administrative fines.

Under the assumption of rationally acting SNOs, the dominant business strategy would be to

calculate the possibility of exposure of respective violations combined with possible fines in

the case of successful lawsuits against the company. If the combination of exposure possibility,

fines, and revenue with infringement is higher than the revenue under law abidance some

companies may decide to violate the law (see e.g. EU vs. Apple regarding USB chargers [232]).

Expressing this in a formula for clarity, 𝜋𝐿 > 𝜋𝐼 − (𝜌 ∗ 𝐹) is the condition for SNOs to obey the GDPR; 𝜋𝐿 is the SNO’s profit under law abidance and 𝜋𝐼 the SNO’s profit under

infringement of the GDPR, 𝜌 with 0 ≤ 𝜌 ≤ 1 expresses the probability of being convicted for

the GDPR violation, and 𝐹 with 𝐹 ≥ 0 stands for the respective fine and/or compensations to

pay. The fines and the provisions in case of illegal behaviour must be high enough to make

GDPR violating business strategies unattractive. For simplicity, any reputation damage from

illegal behaviour is not included in the formula. This simplification is supported by the fact that studies have revealed that privacy scandals in SNSs had no sustainable effect on users’

behaviour and usage of SNSs [132, 134].

Although one can only speculate about a possible positive margin 𝜋𝐼 − 𝜋𝐿, the corresponding

articles 21 and 77–79 as well as several recent whistle-blower cases give the impression that 𝜌

is closer to 1 than to 0. Two different factors determine the amount of 𝐹: compensations and fines. Respective lawsuits from users affected by SNO violations “shall be brought before the

courts competent under the law of the Member State” according to Art.82. Therefore, a possible

monetary compensation depends on the law and the court practices of one of the 28 EU member

states, which goes beyond the scope of this discussion. However, the administrative fines are

clearly defined in Art.83.4–6.

There are two different cases to distinguish: an administrative fine of up to 10,000,000 € or up to

2% of the worldwide annual turnover of the preceding financial year of the guilty company,

whichever is higher [Art.83.4], and an administrative fine of up to 20,000,000 € or up to 4% of

the worldwide annual turnover of the preceding financial year of the guilty company, whichever

is higher [Art.83.3, 5]. Thus, the current largest SNO, FB, would have faced a maximal fine of $553 million for the year 2016 in the first case, which corresponds to slightly more than 5% of its net income. In the second case, FB would have been fined $1,106 million, which equals roughly 11% of its respective net income (calculated from [233]). Considering that these fines must be multiplied with a 𝜌 lower than 1, and under the condition of rational acting, FB has the incentive to infringe the GDPR if 𝜋𝐼 ≥ 1.11 ∗ 𝜋𝐿.
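For illustration, the fine mechanism of Art.83 (the higher of a fixed cap and a percentage of worldwide annual turnover) and the compliance condition derived above can be sketched in a few lines of Python. The turnover and net-income figures below are assumptions back-calculated from the chapter's FB 2016 example, not official values:

```python
def administrative_fine(turnover: float, fixed_cap: float, pct: float) -> float:
    """Art.83-style fine: the higher of a fixed cap and a share of the
    worldwide annual turnover of the preceding financial year."""
    return max(fixed_cap, pct * turnover)

def obeys_gdpr(profit_lawful: float, profit_infringing: float,
               rho: float, fine: float) -> bool:
    """Compliance condition from above: the SNO obeys iff pi_L > pi_I - rho * F."""
    return profit_lawful > profit_infringing - rho * fine

# Assumed 2016 figures in USD (back-calculated: 2% of turnover ~ $553m):
turnover = 27.65e9

fine_case_1 = administrative_fine(turnover, 10e6, 0.02)  # Art.83.4
fine_case_2 = administrative_fine(turnover, 20e6, 0.04)  # Art.83.5
print(round(fine_case_1 / 1e6))  # -> 553 (million USD)
print(round(fine_case_2 / 1e6))  # -> 1106 (million USD)
```

Under these assumptions the sketch reproduces the two fine levels discussed above; varying 𝜌 in `obeys_gdpr` shows how a low conviction probability shrinks the expected penalty.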

This equation is of interest when combined with the finding of Rodger that 30% of the

companies affected by the GDPR will expect a rise of more than 10% in their budget to comply

with the legislation [234]. In other words, if large SNOs such as FB have the choice between a

10% budget raise to stay compliant with the GDPR or risking a fine for infringement, they will

probably choose the second option under the described assumptions. However, small SNOs and

start-ups with low budgets could face bankruptcy if targeted with a maximal fine of 10,000,000

€ or 20,000,000 €, respectively.

4.2.6.2.2 GDPR Impacts on the SNS Market

To discuss the consequences of the GDPR for the SNS market and user privacy, the GDPR’s Art.37, the designation of the data protection officer (DPO), must be considered in addition to Art.83.3–5. The

increased duties for the SNO will increase the costs of operating and launching SNSs through

additional recruitment (e.g. DPOs) and investments in new technology [234–236]. The

increased costs of starting and maintaining an SNS increase financial market barriers and

strengthen the market positions for big players such as FB which can adapt more easily. In

addition, it will be harder for newcomers to set up an SNS and compete on the market. However,

start-ups have the chance to build an SNS from scratch and integrate the PBD approach, which

might counteract the financial market barriers to some degree.

Another notable effect is that fines “have the effect of providing relatively strong incentives to

meet the specified minimum level of service quality, but provide no incentive for the utility to

outperform the minimum standard” [237]. Translated to the SNS case, this entails that the SNS

privacy level cannot be expected to rise beyond the GDPR obligations. This effect is increased

by the lack of competition for privacy due to the SNS market structures and the reluctance of

SNS users to pay for privacy (cf. chapter 5 & [90]). As the analysis shows, the GDPR fails to

address any SNO arrangements to raise user awareness for privacy. Therefore, privacy will

not become a competitive factor for the SNS market through the legislation, but will be a

financial and effort burden for SNOs. Furthermore, the legislation does not change the SNS

market dynamics leading towards a monopoly situation (cf. chapter 2). The sole factor working

against these market dynamics is Art.20, the “right to data portability”, which could facilitate

the change from one SNS to another and may increase competition but not necessarily

competition for user privacy.

4.2.7 Technical Feasibility

This subchapter examines the technical feasibility of the previously analysed GDPR provisions,

and the three major conditions of user privacy are used again as a division. The necessary

legislation requirements for awareness and transparency are presented in subchapter 4.2.3.1.

The provision of relevant information in a transparent and comprehensible manner according

to articles 14 and 15 is a matter of clear and unambiguous language and symbol design [108,

238]. Transparency is not an insoluble task for SNOs, although many negative examples currently

exist (cf. [108]). Possible solutions for a clear and unambiguous privacy policy design were

provided by Kelley and Breese in 2009 [239]. Furthermore, the SNO’s information duties can easily be fulfilled through email or SNS-internal personal messages. As all current major SNSs

require an email-address to register and provide internal, personal messaging, both ways are

feasible. Finally, the TETs needed to comply with EU GDPR Art.15 are available and

implemented by some online services (e.g. Google Dashboard, cf. [164] & subchapter 3.2.4).

The GDPR’s requirements concerning privacy awareness and transparency are not a technical

obstacle for SNOs.

The situation is different regarding the given requirements for user enforcement and control

(see subchapter 4.2.3.2). While some SNS user rights, such as the “right to access” [Art.15], the “right to data portability” [Art.20], and the “right to object” [Art.21], are feasible from a technical viewpoint, other obligations for the SNOs contain serious obstacles. This can

be illustrated by the case of the Syrian refugee Anas Modamani, who uploaded a self-taken photo of himself and the German chancellor Merkel to FB [195]. The photo was later re-uploaded and misused by several right-wing activists within FB, falsely claiming that the Syrian refugee was a terrorist. According to Art.16, Modamani has a “right to rectification” for this false

information. However, in the current state FB cannot pro-actively identify and correct this false

information if it is not reported by other users to the company [195].

As stated by Weber and Koops, similar problems apply to the “right to erasure” [Art.17][188,

240]. Again, the case of Modamani is a fitting example. With the current technological

capabilities of FB, the refugee was able to delete the photo with chancellor Merkel from his

own account, but neither he nor FB could prevent further spread of the picture within and

outside of the SNS [195]. Due to several online news reports about the case, copies of the photo

can be found in various sources within and outside the network and countless links are

identifiable via different online search engines leading to those copies. According to Art.17.2,

FB is now obligated to inform the controllers of those copies “that the data subject has requested the

erasure by such controllers of any links to, or copy or replication of, those personal data.” The

resulting hunt for links to and copies of Mr. Modamani’s picture on the internet “might become

an endless one” [188]. To recapitulate, the user rights that are performed within the SNS with the uploaded PD are feasible (articles 15, 18, 20 & 21), while the rights which reach beyond the borders of the SNS's own ecosystem (articles 17 & 19) face serious technical hurdles.

The GDPR requirements regarding monitoring and accountability are summarised in

subchapter 4.2.3.3 and are discussed succinctly here. The legislation grants the SNS users strong rights

with articles 77 to 80. The privileges from those articles include the “right to lodge a complaint”

with the supervisory authority and to mandate a non-profit organisation (NPO) to

representatively lodge a complaint. However, all these rights depend on the technical powers

and abilities of the according national supervisory authorities to investigate complaints, as

defined in articles 51 to 59. These powers are given by Art.58.1 and the technical capabilities

are limited by the available state-of-the-art technology and the corresponding national

authorities’ funding. With 28 EU member states and at least 28 different supervisory authorities,

there are too many unknown variables to provide a reliable outlook on these capabilities.

However, the case of Germany’s data protection agency ordering FB to stop collecting

WhatsApp user data from German users provides a vivid example of the difficulties a national agency faces when dealing with a globalised company [241]. Currently, the German data

protection agency has no sufficient means to monitor whether FB follows its rules or not. Moreover, FB's computer centres are distributed all over the world, and while the centres in Germany and within the EU may have stopped collecting the German users' WhatsApp data, the computer centres in the USA or India may still proceed unobserved.

The crucial problems with the GDPR's technical feasibility lie in the fields of user enforcement and control. The “right to erasure”, also called the right to be forgotten, and the “right to rectification” (articles 16 & 17) contain serious technical obstacles, especially if the data is

copied and then disseminated by third parties (cf. [188]). Furthermore, to monitor whether

SNOs follow the legislation’s requirements, the corresponding state authorities must be equipped with sufficient staff, technology, and the rights to keep up with globally acting

companies, such as the current major SNSs.

4.3 Conclusion

This chapter analyses the GDPR and the European approach to privacy. Subchapter 4.1 presents

a multidimensional privacy framework consolidating the state-of-the-art privacy research results

regarding SNSs. The framework includes the three dimensions of privacy condition, privacy

type and time as an approach to evaluating privacy legislations, privacy policies, and privacy

measures for SNSs. The framework is then used to evaluate the European GDPR in subchapter

4.2.

The EU GDPR provides sufficient protection regulation for some aspects of user privacy in

SNSs but does not cover all dimensions of the multidimensional privacy framework developed

in subchapter 4.1. The ex ante dimensions of user privacy are mostly unattended by the current

state of the legislation (cf. Table 12). Moreover, due to governance through regulations and

fines, the EU GDPR provides the incentive for SNOs to deliver the minimum level of required

privacy, but does not enforce privacy as a competition factor in SNS markets or break the market

dynamics towards privacy dismantling and monopoly conditions. This situation is aggravated

by the fact that the “right to data erasure” and its archetype, the right to be forgotten, are

impossible to implement on the SNOs’ side (cf. subchapter 4.2.7). Thus, the legislation does

not provide a sustainable solution to the problems of IA and MH or to the principal-agent

dilemma described in the chapters 2 and 3. SNOs still have the incentive to elicit as much PD

as possible from their users to increase their competitiveness on the market’s advertisement

side despite the enforcement of the GDPR (see also chapter 5).

Furthermore, as the analysis in subchapter 4.2 illustrates, the EU GDPR contains no obligation

for SNOs to increase the privacy awareness of their users. Only the supervisory authority

provided by the corresponding EU member state must “promote public awareness” and might

not reach the users within SNSs during their decision-making process [Art.57.1.b]. However,

the European regulation can reduce the IA between SNS users and the SNOs, presupposing the

users care for their privacy and are aware of their rights. In addition, the clear regulations

defining which data can be processed for what purposes restrict the PD gathering and analysis

by the SNOs to some extent. These restrictions in combination with the SNOs’ duties to enhance

transparency, secure data protection, and employ a DPO will reduce their revenues. An

illustration of the GDPR’s impacts on the SNS market structure is presented in Figure 14.

Figure 14. GDPR's Influence on the SNS Market Structure.

5 Social Network Services: Competition and Privacy

In accordance with the results of the previous chapters, this chapter addresses RQ4 about

whether competition in the SNS market can enhance user privacy (RQ4.a) and which

interventions may direct the market dynamics to make user privacy a competition factor

(RQ4.b). As shown above, SNS business models highly depend on the gathering and analysis

of PD to obtain an advantage in competition for advertising clients (cf. chapter 1), though the

extensive collection and analysis of this data poses a threat to users’ privacy. From an economic

perspective, it seems rational for SNOs to ignore the users’ desire for privacy (cf. chapter 2).

However, privacy-friendly services might have the potential to earn users’ trust, leading to an

increased revelation of PD (cf. chapter 2 and [105]). 25

Addressing these issues, the existing privacy problem with SNSs in the context of competition

between SNOs is examined in the first part of this chapter to investigate RQ4.a about whether

competition tends to enhance user privacy or whether it is the root of its violation. Therefore,

subchapter 5.1 takes a purely economic perspective on the present privacy problem in SNSs.

For this purpose it investigates privacy in the SNS business, focusing on the market structure

and its dynamics, taking into account that SNSs constitute MSPs [44]. Analysing the user and the advertiser side of SNSs, their competitiveness and its influence on user privacy are compared.

Therefore, insights from theoretical literature concerning MSPs, behavioural research papers

about SNSs and privacy, and market evidence are consulted.

Subchapter 5.2 then pursues RQ4.b and examines several popular SNS market interventions

with regard to their ability to facilitate user privacy. Therefore, the insights and the model of

subchapter 5.1 are used to analyse whether the corresponding interventions have the potential

to change the market dynamics towards privacy-friendly competition with user privacy as a

competition factor. Further ideas are discussed and the results of both subchapters are then

summarised.

25 This chapter includes and extends papers [31, 33].

5.1 SNS Market Competition and its Influence on Privacy

This subchapter is structured as follows: a brief overview of the related literature is presented

in the paragraphs below. The examination starts with an analysis of the influence of SNS

competition on user privacy by examining the characteristics of the goods which are up for

rivalry on the different market sides in the SNS environment, and by investigating those goods

and competition while considering appropriate findings from scientific literature. Furthermore,

their impact on user privacy is analysed, and the results are discussed and matched with

empirical evidence.

User privacy in SNS is a widely examined and discussed issue. The topic can be divided into

two main research streams: behavioural and user-focused research and SNO-focused research.

The former offers the seemingly contradictory result that users care for privacy and try to

preserve it with PSB [72], but do not act in a privacy-aware manner and carelessly disclose PD

when using internet services [106]. These findings led to the definition of the privacy paradox

[70]. Furthermore, SNO-focused research has shown that privacy is not a major market factor

in the competition for user attraction, although users consider it of high importance to them [17,

186].

However, it is still open to research where the dynamism arises from that drives SNOs to claim more and more PD and thereby restrict user privacy. Different forces interact with each other and with the participants in an SNS, and most of those forces have direct or indirect influences

on privacy [30]. One popular assumption is that users’ demand for privacy is a minor priority

for SNS providers because users are not willing to pay for it and the monetary income is

generated by advertisement customers [39, 75]. Recent successful mail services show that at least some users are willing to spend small monetary amounts for increased communication privacy.26

Other companies in the internet search business even demonstrate the possibility of succeeding without demanding any money for a privacy-respecting service [242]. Nevertheless, this willingness to

pay for privacy either in a monetary way or in terms of switching costs seems insignificant for

SNSs [77, 106].

The more general question of competition in MSPs has been addressed from different angles.

The economics of TSMs have most notably been explored by Rochet and Tirole [99], and

26 Mailbox.org, Posteo.de and others.

further research into MSP market structures has been conducted by Armstrong [100], Evans

and Schmalensee [94], and others [44, 50]. While these highly recognized studies provide deep

insight into the economics of MSPs, they do not cover privacy issues. A variety of publications

address the questions of competition and monopolistic tendencies in online MSPs, while

focusing on the search engine market and Google’s market position in particular [243, 244], or

on SNS and internet services in general [81]. However, the potential interrelation of the privacy

problems in SNSs and the market structure are not the focus of current research.

5.1.1 Economic Analysis: Requirements and Model

In the following section, the MSP business of SNSs is introduced and clarified, and the traded

goods in an SNS environment are examined from different angles to determine their

competitive character. Their influence on user privacy is then considered for both sides of the

SNS/MSP entity (i.e. the user and the advertiser side). For simplicity, other SNS participants such as application developers are disregarded.

As argued above, SNSs in general constitute MSPs, generating revenue by brokering targeted

advertising to their users for business partners [39, 42, 101]. A closer view of the market structure

reveals strong, direct same-side network effects between its users, because each additional user

makes the SNS more attractive to others [67, 69]. Moreover, there are indirect cross-side

network effects between users and advertisers. Each additional user makes the network more

valuable for advertising clients. This is due to a broader audience for targeted advertisement

and to a higher amount of user data, which elicits the possibility of drawing inferences and

creates more precise profiles [44]. On the other side, users accept personalized

advertisements as a price to use SNSs free of monetary charge [101]. However, there are no

positive network effects between the different advertisers, and the opposite can be assumed.

While advertisers profit from additional SNS users and often perform side advertisement by

promoting their SNS company profiles (e.g. advertising the company’s FB or Twitter-site), they

are rivals to other advertisers within the same SNS for the limited targeted advertisement space

and the limited user time and attention (see Fig. 2).

5.1.1.1 Features of SNS Goods

The currently leading SNSs auction targeted advertisement for specific audiences or keywords

in a real-time bidding system between interested advertisers. The space for advertisement in the

network is limited due to users’ limited time, and advertising clients can exclude each other

through a higher bid for the same keyword or target group. Thus, the good of advertisement is a rival good, and one must assume that there is a strong rivalry between similar advertisers. Furthermore, an SNO can decide to exclude some advertisers from its service, which makes advertisement in SNSs excludable as well. Classifying these insights into a feature-of-goods table shows that the service of providing targeted advertisement within SNSs is a classic private good [83] (see Table 13).

Categorizing the SNS users-side is more complicated. First, two different goods are

distinguished: the plain SNS membership and the actual use of the network. The first usually

only requires a valid email address and approximately two minutes of filling out the application

form and confirming one’s mail address. The second comprises homing costs, time, and effort

to understand the SNS’ practice and add user created content, and both add up to the switching

costs (c.f. subchapter 1.1.2). It has been argued that users experience positive direct network

effects from other users. Profile creation and SNS usage are non-rival, as creating a profile and actively participating in an SNS does not prevent anyone else from joining or using it.27 In

addition, it is unclear whether these user-sided goods are excludable, which is the difference

between a public good and a club good. At first glance, it seems intuitive to argue that these

goods are public because no one seems to exclude anyone else from the usage. However, people

from Turkey trying to access FB during the military coup in July 2016 or people trying to use

FB in China or North Korea will disagree. Countries and SNOs have technical instruments to

restrict SNS access or directly ban specific users.28 Thus, SNS registration and usage are non-rival but excludable on the users' side, which complies with the characteristics of a club good (see Table 13).

27 Except for server overload, which is not discussed here for simplicity.

28 Technical workarounds for users are negligible for this analysis.


                excludable                       non-excludable

rival           "private good":                  "common good"
                targeted advertisement

non-rival       "club good":                     "public good"
                SNS registration & usage

Table 13. Feature of Goods Classification for SNS.
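The classification underlying Table 13 follows a simple two-axis decision on rivalry and excludability; a minimal sketch of that decision procedure (the function name is mine, not the source's):

```python
# Two-axis feature-of-goods classification underlying Table 13.

def classify_good(rival: bool, excludable: bool) -> str:
    if rival and excludable:
        return "private good"   # e.g. targeted advertisement in an SNS
    if rival and not excludable:
        return "common good"
    if not rival and excludable:
        return "club good"      # e.g. SNS registration & usage
    return "public good"

kind_ads = classify_good(rival=True, excludable=True)    # targeted advertisement
kind_use = classify_good(rival=False, excludable=True)   # SNS registration & usage
```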

5.1.1.2 The SNO Viewpoint

The last angle is the SNO perspective, as the providers compete among themselves for

advertising clients. All SNSs offer approximately the same product on this market side: targeted

advertisement. It applies here that money spent by an advertiser in one network cannot be

spent twice, and advertisement space cannot be assigned multiple times. One can assume that

the SNS advertisement market side is in strong competition because several providers supply a

comparable private good to a high quantity of advertisement-willing companies (see Figure 15).

In addition, user registration and membership seem to be non-rival from an SNO perspective

because users can easily set up multiple accounts in different SNSs. Moreover, a provider

cannot prohibit its members from registering at other networks or hinder other services from

opening their registration to them. Yet, users can decide to refuse a certain SNS. However, competition between SNOs for attracting users undoubtedly exists, since the quantity of accounts is a signal to attract advertisers.

Third and most interesting from an SNO perspective is the time that users spend in the network.

It seems to be a highly valuable good for providers, because it increases the possible quantity

of advertisements shown to users and the amount of PD disclosed by them [72]. Furthermore,

users’ time and attention are limited and can only be spent once on an SNS. Strong competition

for users’ time between SNSs and other services can thus be presumed [102]. As a result, users

are not only paying for an SNS with revealed PD, but also with their time and attention (see

Figure 15).


Figure 15. The Market Structure of SNS.

5.1.1.3 Competition on the Users’ Side

Analysing the SNS’ competition on the users’ side of the market structure, two relevant factors

are identified first: trust and enjoyment [67, 69]. As described before, the SNO wants to receive

users’ time and attention for advertisements and their disclosed PD for improved targeting of

these advertisements (cf. subchapter 3.2). On the other hand, users want to enjoy an SNS and

demand that it be trustworthy [67, 69, 105], while enjoyment also includes strong and positive

direct network effects (i.e. finding friends as active members within the same SNS) [67].

Regarding the trust factor, the literature shows that SNOs can increase trust in an SNS by implementing privacy controls [69, 113]. This is relevant because improved trust

increases the SNS usage, the quantity of disclosed PD, and the acceptance of advertising [103,

114]. Thus, trustworthiness is beneficial for SNOs to bind users to their SNSs and to receive

more UGC and reliable PD. Although the implementation of privacy controls has a positive

influence on user privacy in SNSs [30], findings suggest a design conflict between privacy and

usability, and thus enjoyment [17].

The enjoyment of an SNS seems to be the most crucial factor in SNO competition. First, it

attracts users to join an SNS and self-evidently increases positive direct network effects

between the users as well as positive indirect network effects from the users’ side to the

advertisers’. Second, enjoyment tempts users to spend more time within the SNS, leading again


to positive influence on advertisers. More time spent in an SNS also increases the quantity of

UGC because content creators value a platform more if they have a larger audience [110].

Furthermore, a higher amount of content also attracts more advertisers because it enhances the

providers’ targeting ability for advertisement. To conclude, there are plenty of motives for SNOs to compete via enjoyment for user registrations and user time, and SNOs face this competition by increasing their own platform stickiness. The literature shows that

there are two ways of doing this: increasing the content of the platform and implementing more

features and functionalities (as long as it has no significant negative influence on the SNS

usability) [44, 102]. While the latter seems privacy neutral for users, changing the platform appearance to entice users to enter more PD and UGC is a threat to their privacy [72]. However, additional platform features also have the potential to harm user privacy when they deduce additional PD from users, or when they leak data from the platform to third parties if the SNO decides to open the SNS to external application developers.

5.1.1.4 Competition on the Advertisers’ Side

As stated above, SNOs compete to sell targeted advertising to corresponding customers (c.f.

subchapter 5.1.1.2). To attract these advertisers, four factors were identified as most relevant:

the quantity of users, the accuracy of user targeting to serve the advertisements, the time users

spend in the SNS, and the price to advertise to the targeted user group. In their evolution, the

majority of SNSs followed the same path: starting one-sided and attracting users, and then

hitting a critical mass, implementing advertisement, and evolving into a TSP [44]. Later, most

SNSs also included external application developers and other services (e.g. identity

management) and transformed into an MSP. The privacy impacts of competition for user

registration are covered in the previous section.

In addition to the quantity of users, the average time a user spends in the network appears to be

the crucial factor in the competition for advertisers [102]. Its theoretical competition and

privacy impacts on the user side are shown above. Another option for tying users closer and longer to a network is to enhance the SNS content, which can happen either by tempting users to post more UGC or by including external content creators and their content directly in the network (e.g. news sites or celebrities) [46, 117]. Another method is to acquire competitors and include their services in the SNS. The possible privacy threats of tempting users to reveal more data are shown above. The method of including further content from external content creators is


basically privacy neutral, except for users’ active reaction to this content (e.g. likes and

comments). However, integrating purchased competitors and merging their existing PD and accounts with already existing in-network user data can be privacy invasive. Merging that data

and drawing inferences from the new database can reveal information which the user initially

wanted to hide by audience-segmentation using two separate services.

The third identified factor is the accuracy of user targeting. The most obvious way of improving

this factor is to gather more PD, either directly from the users or from external sources, to

analyse it with algorithms. The impacts of PD accumulation have been discussed. One must expect that the possibilities of enhancing targeting through PD reach a limit beyond which gathering more data does not result in further improvement. Thus, another method to improve targeting is to inquire with users directly about their interests, or indirectly by giving them the controls to correct the information and drawn inferences connected to their account (cf.

subchapter 3.2). The latter can be privacy enhancing for users if it allows them to delete data or

exclude certain information from the targeting mechanisms.

Finally, the cost-benefit factor of advertising on an SNS represents a crucial aspect, since the advertisers’ side is the monetarily paying side. As agents acting in an economic environment,

advertisers seek the most efficient way to spend their money and target their audience. In

addition to the developers’ achievement of a leading targeting algorithm and the topic of feeding

it with user PD, SNOs have another factor to influence the efficiency of their advertisement

offer: economies of scale. In addition to non-rivalry and excludability, user membership and

activity in SNSs have the characteristic of low marginal costs. The costs to provide the service

to an additional user are, after establishing a working system, marginal for the SNO. The same

is true for the advertisers’ side and the service of automated advertising space auctions, which

leads to increasing returns to scale and makes an SNS more efficient as both sides become larger.

Thus, the direct way to increase SNS attractiveness for advertisers is to gain more users who

spend time in the SNS. This becomes even more efficient if these users originate from different

target groups, because the SNS then represents all sections of society. This insight leads directly

back to subchapter 5.1.1.3 and the discussed influence on privacy and competition.
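The scale argument can be stated compactly: with high fixed costs F (development, server infrastructure) and a near-zero marginal cost c per additional user or auctioned advertisement slot, average cost falls monotonically in output, the standard illustration of increasing returns to scale (the notation here is mine, not the source's):

```latex
\mathrm{AC}(q) = \frac{F}{q} + c, \qquad
\frac{\partial\,\mathrm{AC}}{\partial q} = -\frac{F}{q^{2}} < 0, \qquad
\lim_{q \to \infty} \mathrm{AC}(q) = c .
```

Because average cost approaches the (near-zero) marginal cost as the platform grows, a larger SNS can always serve both market sides more efficiently than a smaller one.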

Furthermore, the SNO perspective on competition factors and their aftermath is summarised in

Figure 16.


Figure 16. SNO Perspective on Competition.

5.1.1.5 The Trump Side of Competition

In the previous sections, the two major sides of SNSs regarding competition and their impacts

on user privacy were analysed. Table 14 summarizes the different activities of SNOs and their

influences. However, it seems uncertain which side of competition outweighs the other, and

whose needs are favoured by the SNO for economic reasons. If the user side and the users’

needs are preferred, the trust factor could lead to an improvement in users’ data control options

and to enhanced user privacy. The opposite is expected if advertisers are the SNO-favoured SNS side, due to their demand for increasing amounts of user data for better profiling.


SNO Activity                                                     Influence on User Privacy

Implementing Privacy Controls                                    +
Give Users Control to Correct and Enhance the Information
  and Drawn Inferences Connected to their Account                +
Implement more Features and Functionalities                      ❍
Changing Platform Design to Entice Users to Enter More Data      -
Merging their Existing User Data and Accounts with Already
  Existing In-Network User Data                                  -

Table 14. SNO Activities and Their Influence on User Privacy.

Following the influential papers on MSPs, the standard inverse relation from classical economics between the markup of price over marginal cost and the elasticity of demand does not hold for MSPs. In other words, one side will be served by the provider even if this service and the price paid for it by its participants alone are not profitable [94]. Thus, the loss from

one side must be outweighed by the profits from the other. To identify the provider’s cash cow,

one must identify on which side MHG is most prevalent [99, 100]. The side where the MSP

participants use more than one platform simultaneously is expected to be overpriced, while the

single homing side is expected to be subsidized [109].
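The classical benchmark invoked here is the Lerner condition for a one-sided monopolist, which ties the relative markup to the inverse price elasticity of demand. In an MSP, the price charged on side i is corrected downward by the network benefit that side i confers on the other side j, so a below-cost (subsidised) price can be profit-maximising. A stylised rendering, with notation of my own that follows the cited MSP literature only loosely:

```latex
\underbrace{\frac{p - c}{p} = \frac{1}{\varepsilon}}_{\text{one-sided Lerner condition}}
\qquad \text{vs.} \qquad
\frac{p_i - \left(c_i - \alpha_j N_j\right)}{p_i} = \frac{1}{\varepsilon_i},
```

where the term α_j N_j denotes the marginal external benefit side i generates for side j. If this externality exceeds the marginal cost c_i, the effective marginal cost is negative and side i is optimally served below cost, i.e. subsidised.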

5.1.2 Reality Check and Market Development

In this subchapter, the theoretical analysis of competition and its influence on privacy in SNS

are matched with available market observations and empirical evidence. Subsequently, it shows

whether this evidence confirms the theoretical assumptions.

5.1.2.1 Privacy Options and Controls

Targeting the trust factor mentioned in subchapter 5.1.1.3, current developments indicate that

SNSs and other internet services have recognized the connection between users’ trust and UGC. Accordingly, FB introduced new privacy controls in 2008 [123] and constantly improves them

[127], while Google implemented its own user privacy controls lauded by specialized press

[122]. In contrast, regarding the FB privacy default options, scholars have found a decrease in

user privacy over the last decade [125, 126]. One reason for the trend of expanding privacy

options and controls in internet services might be the upcoming reform of the EU GDPR which

comes with rigid rules and severe financial penalties in the case of violations [34, 245]. The

future of this trend is lastingly influenced by user behaviour and by the corresponding law. Statements by the European Commissioner suggest that the end of the road of regulating internet services concerning competition and user privacy has not been reached [93]. Since these two conflicting incentives for SNOs to implement privacy options and controls cannot be isolated from one another, it is impossible to say which one prevails.

5.1.2.2 SNS Competition for and MHG Behaviour of SNS Users

As shown in subchapters 5.1.1.3 and 5.1.1.4, SNOs compete against each other for users’ time

spent on their platform and for content. Therefore, they implement new features into their

platforms or integrate taken-over services and external apps. Recent developments in internet

services and SNSs make this competition visible. The acquisition of WhatsApp by FB, as well as the takeover of Instagram, led to FB’s dominance in the fields of mobile messaging

and mobile photo sharing. Moreover, FB started partly merging FB and Instagram accounts and announced in the latest change to WhatsApp’s terms and conditions that phone numbers and contacts will be transferred to FB [246]. Both clearly carry the privacy threats of merging the accounts of different services.

Furthermore, FB’s recent introduction of Instant Articles can be interpreted as an attempt to

enlarge their own content and as an attack on Google and Twitter [247]. Furthermore, FB

included the feature of selling tickets for events and recently announced the possibility to run

crowd-funding campaigns and collect money directly inside the SNS [248]. This seems to be an attempt to use its large user base and the resulting positive network effects to enter the markets for online ticket sales and crowd-funding, and to keep the users in the FB environment as

long as possible. The same applies to the implementation of its own browser within the FB

app, a strategy also used by Twitter. Furthermore, FB recently started a video service similar to

Netflix within the SNS [129]. It appears that FB is developing from an SNS to a universal

internet entertainment MSP. This is also true for the MSP Google, which purchased the video-

SNS YouTube in 2006 and included various advertisement services in the following years. It

also offers services such as Google Mail, Calendar, Maps, Drive, Docs, Photos, Keep, Translate,

and the SNS Google+.

Moreover, companies such as Google, FB, and Twitter provide identity management features

which enable users to log in to external services with their existing SNS accounts. What seems like a convenient feature that makes managing different online accounts easier for users can also be interpreted as a way to gather more data from external services, track users beyond the


platform, and enhance targeting and time for displaying advertisements. The privacy threat

aspects of these strategies are explained above.

The most interesting question is which side, the users’ or the advertisers’, is subsidized and who

is the “cash cow” (cf. subchapter 5.1.1.5). Evidence from the magazine industry suggests that users are subsidized and advertisers are the main income source [249, 250]. The fact that users do not pay any monetary price for SNSs, while advertising is the main income source for SNOs, strongly supports this view [251]. Nevertheless, considering MHG as the crucial factor, recent

statistics suggest that it is on the rise on the user side with 52% of US users using two or more

social media (SM) sites in 2014 [61]. However, the relevant factor of SNS competition on the

user side is also the time spent in the network (cf. subchapter 5.1.1.2). FB leads by far, with 70% of its users using the platform daily, ahead of Instagram with 49% and Twitter with 36%, whereby Instagram also belongs to the FB environment. Moreover, “the engagement of Facebook users continues to grow, while daily use on other platforms shows little change” [61].

5.1.2.3 The Advertisement Market

Users show MHG behaviour especially regarding their SNS memberships, while their time

spent within SNSs has a strong tendency towards the FB conglomerate. The market distribution

is used as a first indicator of MHG behaviour on the advertisers’ side. In 2014, Alphabet’s share

of the net digital advertising revenue was 31%. This revenue includes the income from targeted

advertising on the Google search sites and the advertisement revenue from services such as

YouTube, Google+, and AdWords. The leader is followed by FB with a market share of nearly

8% and the Chinese online search engine Baidu with close to 5% [252]. Despite these distinct

numbers, the development of the online advertisement market indicates that the competition

between the targeted advertisement-offering SNOs is rising. While Alphabet kept its market

share over the last three years, competitors are catching up, as FB nearly doubled its market

share from 4% in 2012 to approximately 8% in 2014. Except for two, all other market participants above the 0.5% market share claimed slightly higher percentages each year [252].

This development indicates that online targeted advertisement is becoming more popular, and suggests that advertising clients tend to use more than one internet service and show MHG behaviour.


This assumption can be verified against the respective SM industry report [253]. The survey reveals that most of the interviewed SM marketers (93%) use paid FB ads on a regular basis to reach potential customers (cf. Figure 17). In addition, 24% use Instagram ads, which also belongs to the FB conglomerate. Far behind, 16% use LinkedIn ads, 15% use Twitter ads, and only 11%

use YouTube ads. These results show that regarding SM advertisements, only a minority of 16% of marketers engages in MHG in addition to using ads within the FB conglomerate. However,

targeted advertisement outside SNSs is not considered in this report. Thus, targeted

advertisement in Google Search results, the Google display network, and Google AdWords are

not included, and the only representative for the Alphabet Corporation within the industry report

is the SNS YouTube. Nevertheless, the industry report gives an impression of the MHG

behaviour on the advertiser side and shows that MHG seems weaker than on the user side with

only 16% of the SM marketers running advertisements regularly outside the FB conglomerate.

In addition, the report reveals that 64% of the marketers plan to increase their use of FB ads

and 42% plan to increase their Instagram ads in the future, while only about 30% plan to

increase their YouTube, Twitter, or LinkedIn ad usage [253]. Similar to the users’ side, the

advertisers’ side shows a strong tendency towards FB.

Figure 17. Social Media Ads Used Regularly by Marketers, according to [253] (Facebook ads 93%, Instagram ads 24%, LinkedIn ads 16%, Twitter ads 15%, YouTube ads 11%, Pinterest ads 3%, Snapchat ads 1%).

Given that both sides of SNSs seem to show MHG behaviour and that the user side does not

pay for their usage in terms of money, it is difficult to apply the insights of MSP markets and

MHG here. One could claim that users are the subsidized side because they are enjoying SNSs

for free. However, one could also argue that advertisers are being favoured and subsidized


because they are the crucial revenue source of SNOs. Moreover, advertisement prices in SNSs

seem low compared to those paid in print media, and one expects the targeting to be more exact due to the revealed user data [251]. Hence, advertisers could be subsidized with better targeting for comparably lower prices, while users could be overpriced in terms of the PD elicited by the SNSs and their lowered privacy. However, given the comparable MHG behaviour of both sides, this case cannot be conclusively assessed.

In summary, the investigation shows that competition between SNOs does not necessarily

improve or harm user privacy. Competition in MSPs such as SNSs is complex and the analysis

shows that there are indeed various privacy harmful aspects (cf. chapter 2). However, the

competition for users has the potential for privacy-friendly consequences, but only if the trust factor outweighs the privacy-contrary implementation of additional features and taken-over services.

5.1.2.4 Limitations

The presented work is limited by the available data. There is no evidence of whether the privacy controls newly integrated by FB and Google are actively used and increase user privacy, or whether they merely have a trust-building and possibly privacy-harming effect. Moreover, the comparably

low costs of targeted advertisement in SNSs can partly be explained by the strong economies of scale and the near-zero variable cost of running an SNS and displaying advertisement

compared to print media. Furthermore, considering the classification of SNS goods, club goods

such as the SNS membership and usage from the user side also have the characteristic of low

marginal costs. This leads to increasing returns to scale, which makes club goods ideal for natural

monopolies [83]. Considering the costs of setting up an SNS by programming the service and

establishing the computing power to serve a broad user base, as well as the high costs of running

such a server infrastructure as fixed costs, there are monopoly market tendencies [254].


5.2 Enhancing Privacy Competition

As subchapter 4.2 demonstrated, the GDPR cannot enhance competition for user privacy

between SNOs. In addition, the legislation only provides the incentive for SNOs to deliver the

minimum level of required privacy measures due to the governance through regulations and

fines (cf. subchapter 4.2.6.2.2). Furthermore, subchapter 5.1 showed that competition between SNOs at the current stage can harm user privacy rather than preserve it. Research question RQ4.b asks which provisions are suitable to make user privacy a competitive factor in the SNS market and may lead to an increase in user privacy through market competition forces. Therefore, the analytical results from the previous chapters are used in the following section to evaluate promising approaches from the current economic discussion, assuming rationally acting business parties.

5.2.1 The Right to Be Forgotten

As shown in subchapter 4.2.3.2, the right to be forgotten was converted into the “right to

erasure” within the GDPR, including a long list of exceptional rules (cf. [216]). For simplicity

and to demonstrate the full capability of the concept, this subchapter considers the original idea

of the right to be forgotten. The notion stands for the privilege that every natural person has the right to have all types of data about her, as well as all links to that data, deleted from any online service, irrespective of the technical or financial expenditure, of ongoing processes, and

of the originator. The European intellectual background of this right is rooted in the French law,

which recognizes the “right to oblivion”. This right originally allowed a convicted criminal to

object to the publication of the facts of her conviction and incarceration, when she had served

her time and was rehabilitated [216, 255].

As demonstrated in subchapter 3.1, the disclosure of PD can be interpreted as a PM for the use

of SNSs by the user to the corresponding SNO. Thus, the subsequent erasure of this data or

parts of it can be considered payment withdrawal, which may include economic disadvantages

for the affected SNO. In addition to the costs of implementing sufficient tools to identify and

erase the corresponding data and links leading to it, the SNO may face disadvantages in BD

capabilities which endanger the SNO’s business model [256]. Consequently, the SNO has the

incentive to take actions to avoid user claims to their right to be forgotten. Following the

analysis of subchapter 2, the results of the advanced SD model (cf. subchapter 2.2.2), and the

results of subchapter 5.1.1.3, there are two rational paths for the SNO.


The first is to decrease users’ awareness of potential privacy problems and of their right to be

forgotten. In that case, the SNO would try to disguise the extent of disclosed, gathered, and

analysed PD as well as to whom it is visible or transferred (e.g. with misleading privacy

policies; cf. subchapter 3.1.5). This SNO behavioural pattern was frequently observed in the

past [108, 133, 134, 159, 238]. However, if the SNOs follow the corresponding passages of the European legislation, they are obligated to provide transparent and comprehensible privacy policies (GDPR, articles 12–14). Thus, disguise is no longer a legal option.

The SNO’s second rational path to react to a rigorous right to be forgotten is to increase the

users’ trust in the SNS, such that the users have less incentive to claim their right. As the results

of the previous chapters show, users’ trust in an SNS is best established by the SNO by providing

appropriate privacy options and data security, as well as trustworthy and comprehensible

privacy policies. All these measures are already required by the GDPR (cf. subchapter 4.2.2).

However, certain privacy applications are best suited to prevent users from choosing the legal route of data erasure and to nudge them to take their PD’s fate into their own hands. The first is simply giving SNS users the option to transparently view and edit their revealed PD. This data editing can happen easily through an appropriate interface (e.g. a privacy dashboard) [174, 175]. The

second option is to give users the ability to limit their data visibility in terms of audience and

durability. Limiting the audience of certain posts or pictures is called “audience segregation”

[138]. This feature allows the user to sort her friends or followers into groups and release data

specifically for them, and it is already implemented by the leading SNSs (e.g. FB) [123]. The

limitation of data durability describes that released data is marked with an expiration date and

automatically erased or made undiscoverable after its expiry. This feature was made popular by

the smartphone application Snapchat, where messages and stories were only available for 24

hours and later hidden from users [257]. Similarly, FB experimented with a comparable feature called “Slingshot” and later implemented transient stories as a feature in FB and Instagram [188,

258]. A positive side effect of these features for the SNOs is that the data stays within their

databases and can still be used for BD, even though it is no longer visible to the SNS users.

In summary, a consistent right to be forgotten for SNS users will lead to an implementation of

privacy features in SNSs to increase user trust. However, in accordance with the assumptions

and results of chapters 2–4, these features will only increase user privacy against other users

and third parties in terms of “audience segregation” and “data durability”. The SNOs ideally

stay in control of the PD and will continue to use it for BD; in return, SNS users will be granted


access to correct, deactivate, or delete their data. However, the right to be forgotten has no impact on direct and indirect network effects within an SNS. Users will still suffer lock-in effects as well as switching costs and may therefore choose to stay in an SNS due to those effects even if it lacks sufficient privacy features (cf. subchapter 5.1). Hence, privacy, or more precisely trust, is strengthened as a competitive factor; but as long as users value enjoyment and staying in contact with their network contacts over privacy, they will continue using their current SNS even if it fails to provide the aforementioned privacy features [67, 69].

5.2.2 The Right to Data Portability

The “right to data portability” is also embedded in the GDPR and represents the user’s right to

receive all PD concerning her and which she has previously provided in a “commonly used and

machine-readable format”, including the right to transmit it to another service (GDPR, article

20; cf. subchapter 4.2.3.2). The following analysis was conducted under the assumption that

this provision will lead to a standard file format for PD in SNSs that will allow the smooth and

reliable downloading from one and the equally easy uploading to another SNS (cf. subchapter

5.2.3). It is expected that the current SNS market leader, FB, will set this standard with its given

file format by market power.
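What such a “commonly used and machine-readable format” could look like in practice is not prescribed by the GDPR. The following is a purely hypothetical, minimal JSON export of user-provided PD; all field names are invented for illustration and do not reflect any real SNS schema:

```python
import json

# Hypothetical minimal machine-readable PD export (GDPR, article 20).
# All field names are illustrative assumptions, not any real SNS schema.
export = {
    "profile": {"display_name": "Jane Doe", "joined": "2015-03-01"},
    "posts": [
        {"created": "2017-06-12T09:30:00Z", "text": "Hello world"},
    ],
    "contacts": ["user-123", "user-456"],  # friend references stay SNS-internal
}

portable = json.dumps(export, indent=2)  # "commonly used" interchange format
restored = json.loads(portable)          # another service could re-import it
```

Note that the contact references in such an export remain meaningless outside the originating SNS, which illustrates why portable data alone does not transfer the direct network effects discussed below.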

When SNS users can download all their previously uploaded PD and integrate it easily into a new

service, this represents a significant decrease in switching costs and a weakening of the lock-in

effect (cf. chapter 2 & [259]), which will make it easier for users to switch to another service

or start MHG and use another SNS in parallel. This decreases the effort for small SNSs and

market newcomers to gain new users and their PD, because data is no longer walled by the

leading services [260]. On the other side, the duty to develop and implement the option for data portability in the SNS is a technologically and monetarily costly burden for the SNO. Thus, the “right to data portability” creates a market entry barrier for newcomers and endangers the SNS business model, which might decrease consumer welfare [261, 262].

While data becomes portable, the circle of friends for SNS users remains within a corresponding

SNS. Parts of the PD may be worthless without the corresponding user’s friends and audience.

Thus, data portability has no impact on an SNS’ direct network effects on the user side. It is

expected that the “right to data portability” will lead to an increase in MHG, but not necessarily

to a weakening of the market leader’s position. This leads to enhanced SNS competition for

user’s time and attention, which is expected to have a negative impact on user privacy rather


than a positive one (cf. subchapter 5.1). In addition, data portability threatens user privacy and

data security, because “when an individual’s lifetime of data must be exported ‘without

hindrance’, then one moment of identity fraud can turn into a lifetime breach of personal data”

[261]. The “right to data portability” will increase the competition between SNSs for users and

their time and attention, but not for their privacy as a competitive factor. The presented evidence

leads to the assumption that this right has a negative rather than positive influence on user

privacy.

5.2.3 The Concept of Interoperability

To be interoperable denotes that “the systems, procedures and culture of an organisation are

managed in such a way as to maximise opportunities for exchange and re-use of information,

whether internally or externally” [263]. Regarding IT, the referred concept is technical

interoperability, which is provided by the development and predefinition of open standards for

barrier-free communication, transport, storage, and representation [263, 264]. For true

interoperability, the open standard is not allowed to favour any provider or business, and must

be equally accessible and utilisable for all market participants [265]. Examples of technical

interoperability are the open and standardised internet protocols, which are the origin of the

internet’s spread and success [263, 266, 267]. Interoperability and its open standards are usually

developed and achieved by NGOs such as the Internet Engineering Task Force (IETF)29 or the

World Wide Web Consortium (W3C)30, which receive funding from several different sources

such as governments (e.g. IETF), or associations of businesses, NGOs, universities,

governmental entities, and private persons (e.g. W3C). The development and implementation

of open standards for interoperability are pushed by collaborations of industry partners to

develop new opportunities and enlarge their distribution range (e.g. in the case of the W3C and

the further development of internet standards) or by governments to increase their citizens’

safety and their economic growth (e.g. USA Interstate Highway System Standards) [267].

Interoperability for SNSs would require an open standard which provides at least (1)

authentication, (2) relationship description, (3) communication, and (4) content sharing across

different SNSs, such that it is as easy to implement and as SNO-friendly as possible [268, 269].

29 https://www.ietf.org/ (last accessed 22.10.2017)

30 https://www.w3.org/ (last accessed 22.10.2017)

The open standard needs a global identifier for each SNS user (1), such that one can state that

a certain user of another SNS is her friend and a format to represent this connection (2). It

requires a standard communication protocol to send messages cross-border between users of

different SNSs, such as the SMTP standard which enables emails to be sent from one provider

to another (3). Finally, a standard that grants friends in other SNSs access to one's own page and

postings and to keep unwanted visitors outside is also needed (4).
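A minimal sketch of how requirements (1)–(4) could be represented; the SMTP-like identifier format and all class and attribute names are illustrative assumptions, not an existing standard:

```python
from dataclasses import dataclass

# (1) Authentication: a global, SMTP-like identifier <user>@<SNS domain>.
@dataclass(frozen=True)
class GlobalUserId:
    local: str
    sns_domain: str
    def __str__(self) -> str:
        return f"{self.local}@{self.sns_domain}"

# (2) Relationship description: a friendship link across SNS borders.
@dataclass(frozen=True)
class Relationship:
    owner: GlobalUserId
    friend: GlobalUserId
    kind: str = "friend"

# (3) Communication: a cross-border message, analogous to SMTP email.
@dataclass
class Message:
    sender: GlobalUserId
    recipient: GlobalUserId
    body: str

# (4) Content sharing: an access policy that admits friends and keeps
#     unwanted visitors away from one's own page and postings.
@dataclass
class AccessPolicy:
    owner: GlobalUserId
    allowed: frozenset = frozenset()
    def may_view(self, visitor: GlobalUserId) -> bool:
        return visitor == self.owner or visitor in self.allowed

alice = GlobalUserId("alice", "sns-a.example")
bob = GlobalUserId("bob", "sns-b.example")   # a friend on a *different* SNS
link = Relationship(owner=alice, friend=bob)
policy = AccessPolicy(owner=alice, allowed=frozenset({bob}))
```

The point of the sketch is that none of the four parts names a specific SNS: identity, friendship, messages, and access control all work across provider borders, as SMTP addresses do for email.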

Several attempts for interoperability between SNSs already exist. For instance, the Extensible

Messaging and Presence Protocol (XMPP) provides requirements 1–3 and was originally

developed for instant messaging [270]. Vodafone's R&D department leads the OneSocialWeb

group's work on enhancing XMPP, adding vCard support as well as streams for microblogging

and content sharing. However, OneSocialWeb lacks reference implementations, and the

project has been inactive since 2010 [269].31 OpenSocial is based on open web services

developed by Google and later transferred to the W3C. It provides the exchange of PD,

relationships, and text-based communication, and is a fully working open standard for SNSs

[271]. However, it is not widely spread and is not supported by the market leader, FB, which

developed its own API for application integration and authentication services [269]. In addition,

various other attempts for an SNS interoperability standard exist or are in development [264,

268].

Currently, SNSs are completely closed systems which imprison their users and the

corresponding PD, because these represent the key to their business model (cf. subchapter 5.1)

[268, 269, 272]. Economically, the wide-spread implementation of SNS interoperability would

change the role of networks from gatekeepers of social interaction and information to interfaces

of social content and communication. The opportunity to communicate and exchange content

from one user to another across the borders of SNSs would eliminate the direct network effects

between users of the same network. The necessity that friends must be members of the same

SNS to communicate and share content with them would no longer exist. Consequently, the

lock-in effect would cease when SNS interoperability is combined with data portability. With

the disappearance of the direct network effects as a competitive factor, the other factors would

gain importance, namely usability and trust (cf. chapter 2). As illustrated above, trust in an SNS

is assured by data security and comprehensible privacy policies, as well as effective privacy

31 https://github.com/onesocialweb (last accessed 22.10.2017)

options for its users [30]. Thus, user privacy would gain more importance under SNS

interoperability, although some findings suggest a design conflict with usability [17].

Furthermore, SNS interoperability disrupts the MHG feedback-loop by dissolving the direct

network effects between users and making content non-exclusive for a certain platform (cf.

chapter 2). Consequently, monopoly tendencies in the SNS market would also be weakened.

On the user side, SNS interoperability would force SNSs into competition for usability and user

trust, as market leaders could no longer build on their large user base and the connected direct

network effects as a competitive advantage.

Furthermore, SNS interoperability would facilitate social network activities for businesses

and celebrities, because they would no longer have to be present on all SNSs to stay in contact

with their customers and fans. One well-maintained page on their preferred SNS would be sufficient

if they could communicate and send their postings cross-border to their audience in other SNSs

[273]. With SNS interoperability and a standard API for all networks, application developers’

programming effort would be reduced and their application could compete for users in all SNSs.

In the cases described above, the network effects also dissolve. However, the user quantity in

an SNS and their time spent within the network would remain the key competitive factor for

advertisers to choose where to spend their budget for targeted advertisement. Thus, the indirect

network effects between the quantity of users and the advertisers within an SNS would last (cf.

chapter 2). With the disappearance of direct network and lock-in effects on the user side, it is

expected that the SNS competition for advertisers, and for users and their time and attention

within the network will increase under SNS interoperability.

From a data security viewpoint, SNS interoperability represents a mixed blessing. On one side,

it increases SNS competition for user trust, and for data security and privacy. Thus, it is expected

that the SNOs’ efforts in this area will increase. On the other side, when content and PD is

transferred cross-border between different SNSs, this multiplies the opportunities for attacks

to intercept this data, such as man-in-the-middle attacks. Thus, the data transfer and communication

between different SNSs under interoperability requires the integration of an open standard for

encryption (e.g. OpenPGP32 or HTTPS [274]). If such a data security standard is implemented

32 http://www.openpgp.org/ (last accessed 23.10.2017)

and maintained, it is expected that overall data security and user privacy will increase under

SNS interoperability.

As illustrated above, SNS interoperability comes in favour of many market participants, but it

will weaken the position of the market leader. According to theoretical findings and market

observations, interoperability will only be achieved by market forces if it is pushed by a

consortium of smaller SNOs which are benefiting from it. This consortium will succeed if its

combined market power is larger than the current market leader’s [269]. For instance, the

most widespread SNS interoperability standard, OpenSocial, is currently supported by the SNSs

MySpace, StudiVZ, and Xing, which are far from endangering the market leader FB

[271]. Another solution to enforce SNS interoperability would be by governmental regulation

(e.g. as a part of an updated GDPR), though the “right to data portability” embedded in the GDPR

might spur general efforts toward interoperability standards [275].

In summary, SNS interoperability breaks the direct network effects in consumption on the user

side and the lock-in effects of SNSs (see Figure 18). It increases the remaining competition

factors usability and trust for the SNOs, the latter consisting of user privacy and data security.

Furthermore, it breaks the MHG feedback-loop identified in chapter 2 and weakens the

monopoly tendencies of the SNS market. It is to be expected that SNS interoperability has a positive

influence on user privacy as a competitive factor for the SNS market. The findings are

summarized below in Figure 18.

Figure 18. Interoperability: SNS Market Role and Impacts.

5.2.4 The Concept of Privacy Trust Banks

The notion of privacy trust banks (PTB) describes a digital service as an intermediary for PD

transactions between the user and an online service; it originates from Clarke's illustration of a

digital persona [276]. The PTB concept envisages that internet users store their PD with a trusted

service, the PTB, which then shares this data as an intermediary with third-party services, instructed

by and in favour of the users [277]. Thereby, the PTB is “intended to enable consumers to

control their own digital persona” [278]. For the situation of SNSs, one can visualise the role

of the PTB as follows: An individual stores her PD, such as her name, address, phone number,

birthdate, email, and a profile photo in the PTB which appears best to her in terms of data

security and privacy. Then she decides to use a certain SNS and as a payment for the usage, she

allows the SNO access to some of her PD via the PTB. Therefore, she logs into the SNS via the

PTB and can then manage the SNO’s PD access within the PTB’s interface. The idea of PTB

services is already available in modified forms (e.g. the open, decentralized

authentication standard OpenID, which allows users to log into third-party services with their

Yahoo, Microsoft, or Google accounts and manage their data access via the corresponding

interface) [279]. In addition, FB offers a similar service which allows its users to login to

external applications with their FB accounts. However, to this day and to the author’s best

knowledge no successful external PTB with a focus on data security and user privacy exists.
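
The consent flow described above can be illustrated with a toy PTB; all class and method names are hypothetical, a minimal sketch of the idea rather than a real protocol:

```python
# Toy sketch of PTB-mediated PD access: the user stores her PD once in
# the PTB, then grants each SNO a scoped, revocable view of it instead
# of handing the data over directly. All names are invented.
class PrivacyTrustBank:
    def __init__(self):
        self._vault = {}    # user -> {attribute: value}
        self._grants = {}   # (user, sno) -> set of allowed attributes

    def store(self, user, attributes):
        self._vault[user] = dict(attributes)

    def grant(self, user, sno, attributes):
        self._grants[(user, sno)] = set(attributes)

    def revoke(self, user, sno):
        self._grants.pop((user, sno), None)

    def fetch(self, user, sno):
        # The SNO only ever sees the granted subset of the vault.
        allowed = self._grants.get((user, sno), set())
        return {k: v for k, v in self._vault.get(user, {}).items()
                if k in allowed}

ptb = PrivacyTrustBank()
ptb.store("carol", {"name": "Carol", "email": "carol@example.org",
                    "birthdate": "1990-01-01"})
# Granting access to name and email is the "payment" for SNS usage;
# the birthdate stays withheld, and the grant can be revoked any time.
ptb.grant("carol", "sns-a", {"name", "email"})
```

Revocability is the decisive design choice here: it is exactly what distinguishes this flow from today's one-off disclosure of PD to the SNO.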

Nonetheless, the existence of external PTBs as intermediaries between the SNS user and the

SNO is assumed in the following. Such a PTB could be created by governmental business

creation or state-ordered corporate unbundling of an SNS. The effects of PTBs on the SNS

market would be similar to the effects of data portability (cf. subchapter 5.2.2); user switching

costs would decrease and users could easily register to different SNSs via one PTB account,

which may increase MHG behaviour. Assuming that the PTBs act in the best interest of their

users, they can work as a representative for them and negotiate with SNOs to increase data

security and lower the price of SNS usage in terms of less demanded PD. This reduces IA and

provides a solution to the principal-agent dilemma described in subchapter 3.2: the more

users of a specific SNS a PTB unites, the more bargaining power the PTB gains for the

negotiation with the SNO. In contrast to a single user, the PTB can control the data access and

usage of an SNS. In this way, strong PTBs uniting many SNS users would force SNOs to offer

data-minimising services and drive SNOs into competition for user privacy and data security.

However, the SNS direct network effects on the user side would be unbroken and PTBs would

likely have less influence on the privacy and data security practices of existing SNSs with large

user bases such as FB.

The two obstacles for independent PTBs are their technological feasibility and their market

role. First, using an SNS requires much more PD than the name and email of a user. Users state

a list of their friends, like and follow celebrities and brands, and interact with each other via

personal messages and postings. If all this data had to be transferred via the PTB, the PTB

would not only have to reflect the SNSs in functionality, but the transfer would also require an

API and a standard open data format as in the case of SNS interoperability (cf. subchapter

5.2.3). While FB allows access to friend lists and the publication of newsfeed posts by external

apps via API, sending personal messages still requires FB or its applications. Furthermore,

editing friend lists and SNS profiles still requires using the FB website or the corresponding

application.33 Berners-Lee explained the problem, stating that each SNS “is a silo, walled

from the others […] a closed silo of content, and one that does not give you full control over

your information in it” [260]. It is unreasonable to expect SNOs to grant managing access to third

parties such as PTBs, because doing so runs against their economic and data security interests. The walled

data and the privilege of analysing it for commercial purposes is the business model core of

SNOs (cf. subchapter 5.1). Furthermore, attacks on a market-leading PTB could easily lead to

a data breach in many SNSs. Even if SNOs were forced by regulations to allow managing access

for PTBs, the development and implementation of a wide-ranging standard data format for SNS

information and its API would represent a large effort in time and money for SNOs and PTBs

(cf. subchapter 5.2.3).

The other challenges for independent PTBs are their business model and market role. If PTBs

are not government-funded (e.g. as a part of consumer protection), they require a profitable

business model. Two plausible solutions exist: either the PTBs can charge users directly for data

security and privacy (e.g. privacy-friendly mail services or data security applications)34 or they

adopt the SNO business model and utilize the entrusted PD to stay profitable. The first option

would reduce their possible customer base, because the user willingness to pay for privacy and

data security is relatively small [73]. The second business model would drive PTBs into a

conflict of interest, because they have to advocate for data protection and minimization in the

33 cf. https://www.facebook.com/about/privacy/your-info (accessed 20.10.2017).

34 mailbox.org; posteo.de; boxcryptor.com; (accessed 07.12.2017).

interest of their customers, but require enough PD to operate profitably. Regarding their market

role, PTBs would act as MSPs similar to the current role of SNSs, with the SNS users as

customers on the one side and the SNSs on the other (see Figure 19). As an intermediary for

PD, PTBs would be subject to similar effects as SNSs (cf. subchapter 2.2). After creating a PTB

account and entering all their PD, users would suffer from lock-in effects and corresponding

switching costs barriers. In addition, direct network effects on the user side would occur as

described above, and these combined effects could lead to monopoly tendencies similar to those

observed at the SNS market (cf. subchapter 5.1). A PTB with a large user base would gain

strong market power to negotiate the best terms for its users, which would then attract new

users. In addition, users may struggle to switch to another PTB due to the effort of PD transfer

and personal setting adjustments they put into their existing PTB account. Furthermore, SNOs

are interested in making the situation for PTBs as complicated as possible to protect their own

business model and keep their users from using PTBs. Finally, the implementation of PTBs

would force SNOs to focus on their second competitive factor besides PD and increase the

competition for user time spent in their SNS.

In summary, PTBs would force SNOs into a competition for data minimization and security, but

also for user time and attention within their SNSs. They would increase transparency, data

control, and user privacy, and decrease users' switching costs for SNSs. However, the

implementation of PTBs is difficult, due to uncertainties in financing as well as the costs and

effort to develop a corresponding data format for SNS interactions. PTBs would act as MSPs

similar to SNSs, and would therefore be subject to the same switching costs, lock-in and network effects on

the user side, including tendencies for market monopolies. The impacts of PTBs on the SNS

markets are illustrated below in Figure 19.

Figure 19. Privacy Trust Bank: Market Role and Impacts.

5.2.5 Further Ideas

Several other ideas for the economic or governmental regulation of privacy for SNSs

are under discussion. With regard to their impact on privacy as a competition factor, these concepts

are addressed and analysed briefly in the following paragraphs, without any claim to

completeness.

In legal studies, data security and privacy rights have recently been compared to

environmental protection law. The analogy is that “like the exhaust and use of chemical

compounds, the omnipresent generation and subsequent use of personal data can impact

individuals as well as society as a whole” [280]. The idea arising from that analogy is creating

a data tax similar to ecological taxation, which taxes businesses for the amount of gathered

PD, a concept proposed in 2013 by the French government [281]. However, this

idea should not be confused with the notion of internet taxes, which describes taxing internet

access or subjecting e-commerce to an extra levy [282]. The implementation of a data tax faces some

obstacles. For example, globally acting online businesses such as SNSs can store their data in

any place in the world, and it is complicated for a single nation to measure the stored data of

their citizens. Moreover, it is difficult to determine whether the gathered data is PD or not.

Therefore, the simplest way is to tax the data flow generated towards the business at a national

level.

The economic impacts of such taxation are the incentives for online businesses to embrace data

minimization and for SNOs to gather the most valuable PD. The tax could also cause SNSs and

other online services to implement more targeted advertisement or resell the gathered PD to

compensate for the tax. Furthermore, the data tax constitutes a market entry barrier for

newcomers. Big players such as FB or Google started offering their service and gathering PD

long before developing a profitable business model [102, 283]. A data tax would set an incentive

for online services to perform data minimization, but would not necessarily lead to more user

privacy or make it a competitive factor. However, a data tax creates a market entry barrier and

solidifies the current monopoly tendency of the SNS market (cf. subchapter 5.1).
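
The mechanics of such a levy can be sketched in a few lines; the rate, the gigabyte metric, and the tax-free allowance (one conceivable way to soften the entry-barrier effect for newcomers) are invented for illustration and correspond to no actual proposal:

```python
# Illustrative sketch of a data tax on the measured flow of citizens' PD
# to a business. Rate and allowance are invented, not a real proposal.
def data_tax(pd_gigabytes: float, rate_per_gb: float = 0.02,
             tax_free_gb: float = 100.0) -> float:
    """Tax only the PD flow above a tax-free allowance, so that small
    newcomers without a profitable business model remain untaxed."""
    taxable = max(0.0, pd_gigabytes - tax_free_gb)
    return round(taxable * rate_per_gb, 2)
```

Under such an allowance, only services whose PD intake exceeds the threshold pay anything, which would shift the burden toward the established big players rather than adding to the entry barrier described above.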

In his book “Who Owns the Future?” Lanier proposed compensating internet users for the profit

they create by disclosing PD [18]. Apart from the technical problem of linking individual

internet or SNS users to the share of profit caused by their revealed PD, this proposal also

contains economic risks. As other scholars have argued, a compensation system for PD would

lead to an economization of PD disclosure [284]. Users would have the incentive to disclose as

much PD as possible to SNSs and other online services to increase their chance for high

compensation. Similar to the “predatory ads” pointed out by O’Neil [14], this would push the less

privileged population to contribute all their PD to online services to gain a share of the

compensation. Thus, a compensation system for PD would undermine the human rights aspect

of privacy and jeopardise the concept of privacy in general [15, 284].

As stated by Evans, web-based businesses are likely to become a target of antitrust campaigns

due to their character as Schumpeterian industries [285]. The utilisation of network effects to

gain and secure a market-dominating position in SNS markets shows similarities to the

backgrounds of the antitrust cases against IT firms such as IBM and Microsoft [285, 286].

Furthermore, because of the internet market’s development speed, the MSP character of most

of its participants, and the interconnectedness including dependencies and rivalries, antitrust

complaints are likely to occur [115, 285]. For instance, a competitor of FB could argue that it

cannot reach its potential customers due to FB’s network effects or its non-existent

interoperability (cf. subchapter 5.2.3).

Apart from the legal difficulties of proceeding with an antitrust complaint against an MSP, the

economic effects of a successful antitrust campaign and the resulting company unbundling are

of interest. As indicated in subchapter 5.2.4, the unbundling of an SNS could create a company

that is solely responsible for the management of users’ digital persona. Such a firm can evolve

into a PTB, which might increase user privacy, depending on the business model (cf. 5.2.4).

Furthermore, an unbundling of the SNS market leader would create an opportunity for the

emergence of a privacy-friendly competitor or increase the market's competition in general,

which would strengthen all SNS competitive factors including trust and the related user privacy.

However, another possible outcome of an antitrust complaint is governmentally ordered

interoperability, including the privacy enhancing impacts illustrated in subchapter 5.2.3.

Antitrust campaigns on the SNS market might, but need not, have a positive

influence on user privacy. Without the previously illustrated PTBs or interoperability as an

outcome, they can be considered as privacy neutral and they have no impact on user privacy as

a competitive factor for the SNS market.

The last proposal to discuss is the German idea of an algorithm “TÜV”35 as recommended by

the Chaos Computer Club, the German consumer advice centres, and the German Minister of

Justice Heiko Maas [223]. The idea is to create a governmental audit authority to check BD

algorithms in advance when they make sensitive decisions about consumers autonomously. The

aims of the algorithm “TÜV” are to make data collection and analysis more transparent for

consumers and affected persons, and to prevent distorting effects by automated decisions in BD

[223, 287]. Although the most common examples of the need for an algorithm “TÜV” are

decisions made by autonomously driving cars, such an authority could also have impacts on

user privacy in SNSs. A state audit would ensure that SNOs obey the GDPR and implement

the corresponding data security and privacy elements (cf. subchapter 4.2). It could also prevent

SNS advertisement algorithms from being used for religious or racial discrimination as

happened with FB (cf. [207]). Furthermore, the increase in algorithm transparency would force

the business into a competition for less discriminating and more trustworthy algorithms [288].

Thus, an “algorithm TÜV” would not directly make user privacy a competition factor, but could

enhance privacy via improved transparency and the competition for trustworthy processes.

35 “TÜV” is a German abbreviation standing for “Technischer Überwachungsverein” that is translatable to

“technical monitoring association”. The association is responsible for the monitoring of cars and other

technical products in Germany (see https://www.tuv.com/; last access 26.11.2017).

5.3 Conclusion

In subchapter 5.1, the challenge of user privacy in SNSs is investigated from an economic

perspective. The aim was to answer RQ4.a and analyse whether competition between SNSs tends

to decrease or enhance user privacy. Therefore, it is first clarified that SNSs are MSPs with at

least two sides: users and advertising clients. Furthermore, the traded goods within SNS markets

are characterized from the three viewpoints of users, advertisers, and SNOs. It is also shown

that SNS membership and usage are club goods from a user perspective, while targeted

advertisement is a private good from an advertiser and SNO viewpoint. Moreover, SNOs

compete for advertisers on one side, and on the other side they compete for users’ membership

and more intensely for users’ time. To attract the advertisers, SNOs must maximize their

targeting for advertisements and the time users spend within their platform. Thus, they have the

incentive to gather as much PD from users as possible and enhance their platform with

additional content, features and services to bind users' attention and time within the SNS.

All analysed factors of competition in the SNS environment contain unilateral privacy

threatening aspects, except for the factor of user trust in an SNS, which is gained by

implementing privacy controls to get users to reveal more data. Furthermore, the fact that users

do not pay in monetary terms for SNS usage but that advertising clients are the crucial revenue

source for SNOs suggests that users might be discriminated against and overpriced in terms of

PD disclosure. The validation against MSP theory, where the MHG side is overpriced and the

single-homing side is subsidized, provides insufficient results to verify or reject this

assumption. In summary, this analysis shows that competition in the SNS environment does not

generally harm or improve user privacy. However, the latter seems less likely, unless the

competition for users’ trust outweighs all the other privacy divergent aspects. The case of

competition in the internet MSPs is complex and current statistics do not provide enough data

to answer the question of whether advertisers or users are subsidized by the SNOs.

Nevertheless, the analysis of RQ4.a about whether competition provides a solution for user

privacy in SNSs shows that competition can be assumed to have a negative influence on user

privacy at the present stage.

Consequently, in subchapter 5.2 the most promising parts of the GDPR and supplementary

concepts for governmental regulation were analysed for their opportunities to make user privacy

a competition factor in the SNS market to answer RQ4.b: what interventions can direct the SNS

market dynamics to enhance user privacy? The right to be forgotten as a consistent approach

for the “right to erasure”, the “right to data portability”, the concept of interoperability, and the

model of PTBs were evaluated, and further prominent ideas were briefly addressed. It was found

that the concept of interoperability seems to be the most promising approach to make user

privacy a competition factor for the SNS market, particularly in combination with the “right to

data portability” to enhance competition in general. Interoperability dissolves the direct

network effects among users in SNSs and breaks the lock-in effect, especially in combination

with data portability which gives each user the possibility to take her data and walk away to use

another SNS. When network effects are no longer a competition factor, the remaining factors

of usability and trust are strengthened, where trust consists of data security and user privacy.

Thus, user privacy gains a more prominent role in SNS market competition. However, several

economic and technical obstacles to data portability and interoperability exist, as both require

a standard format for data transfer between SNSs. While promising approaches are on-going,

their implementation would be against the economic interest of the market-dominating SNS

and would require a large supporter base among its competitors or direct legal intervention to

be enforced (cf. subchapter 5.2.3).

6 Does Economic Competition and Regulation Provide a

Solution for Privacy in SNSs?

The results of this dissertation are proposed as a contribution to analysing user privacy in social

network services (SNSs) and providing proposals for its enhancement via economic and

information system means as well as an economic evaluation. Therefore, it is structured into

five preceding chapters that build on one another and address the dissertation’s research

questions, RQ1 to RQ4. Several open research challenges and topics that are out of the scope

of this dissertation remain. Consequently, this chapter first concludes with a summary and the

main results. Based on the main results’ discussion, this chapter then suggests directions for

future research on user privacy in SNSs.

6.1 Summary and the Main Results

In addition to providing the introduction and presenting the RQs, chapter 1 establishes the first

research frame for the dissertation by describing the character of SNSs as MSPs and illustrating

their business model. It demonstrates that SNOs rely on users to reveal their PD to analyse it

for the purpose of delivering targeted advertisements to their business clients. Furthermore, the

chapter provides a state-of-the-art overview of privacy research and its history, approaches,

theories, and typologies. The current mainstream approach to user privacy in SNSs is static,

focusing mainly on legal, behavioural, and computer science perspectives. However, this dissertation

provides an economic and information systems approach, addressing the market dynamics that

are influencing user privacy.

Chapter 2 answers the research question RQ1 about the economic participants, architecture,

governance, and effects in the SNS ecosystem at the system level, and how they interrelate with

user privacy. The chapter’s aim is to describe the market dynamics on privacy at a system level

and set the second framework for the following investigations. Therefore, the interrelations

within SNSs and the enclosing market structure on user privacy were analysed with SD

modelling. With a literature review investigating SNSs as MSPs, these interrelations were

identified, following a typology of Staykova and Damsgaard [44]. The SD approach provided

a qualitative model and divided the SNSs and their ecosystem interrelations into a core and a

periphery. The model provided insight into the factors influencing SNS users' UDD

and their SNS usage in general. The analysis showed that users’ UDD is mainly driven by the

time users spend within a network and the content they add to it, which is positively influenced

by the number of SNS users and the provided SNS features. Furthermore, the time users spend

on SNSs is influenced by the SNS features and by users’ trust in the SNO. This trust is gained

by implementing transparent and effective privacy policies and controls. UGC and UDD are

positive factors for the SNOs profiling capabilities which build the core of their profitable

business model. Finally, the model revealed that user MHG behaviour has a positive influence

on UGC and indirectly on the SNS market competitiveness, with secondary negative impacts on

user privacy. Matching these theoretical findings with empirical evidence revealed the MHG

feedback-loop: users’ MHG behaviour strengthens market pressure, which leads to SNOs

implementing more platform features and users spending more time and disclosing more PD in

the dominant network. This feedback-loop facilitates a winner-takes-all outcome for the SNS

market in the long term.
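The MHG feedback-loop described above can be sketched as a toy simulation. The following snippet is purely illustrative: the two-network setup and all coefficients are invented assumptions, not parameters of the dissertation’s SD model. It only demonstrates how a small initial advantage, amplified by network pull, produces a winner-takes-all outcome.

```python
# Toy sketch of the MHG feedback-loop: users multi-home, but the larger
# network attracts more time, content, and PD, reinforcing its lead.
# All coefficients are hypothetical illustration values.

def simulate(steps=50, growth=0.5, network_pull=1.0):
    share = [0.55, 0.45]  # market shares of two competing SNSs
    for _ in range(steps):
        # The share gap drives where multi-homing users spend their time.
        pull = network_pull * (share[0] - share[1])
        share[0] = min(1.0, share[0] + growth * pull)
        share[1] = max(0.0, share[1] - growth * pull)
        total = share[0] + share[1]
        share = [s / total for s in share]  # shares always sum to 1
    return share

final = simulate()
print(final)  # the initially larger network ends up dominating the market
```

Weakening the pull parameter (as effective data portability would) keeps both shares interior, which mirrors the regulatory argument developed in chapter 5.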

The case study of PD as a PM in SNSs is presented in subchapter 3.1, and it addresses research

question RQ2.a about whether SNS users’ PD disclosure is interpretable as an economic

transaction. The challenge of user privacy in SNSs was approached on an individual level. The

study illustrates that user privacy in SNSs can be understood as an economic

procedure, with users’ PD disclosure to the SNO legitimately interpretable as a

non-monetary PM for SNS usage. From this perspective, the economic problem and

the user privacy dilemma become evident: in the absence of transparency, SNS users lack IA

against SNOs regarding the amount and usage of their disclosed data. The study shows that

users underestimate both the data-usage rights granted to the SNO and the SNS usage price in

terms of PD. In addition, the study’s experimental part showed that SNS users exhibit price

sensitivity regarding PD. This means that users chose to use fewer SNS features to avoid data

disclosure under the experimental conditions, which indicates that they cannot

enforce their privacy preferences under present terms.

Subchapter 3.2 answers research question RQ2.b: Can the user privacy dilemma in SNSs be

modelled in classic economic terms? To illustrate the reasons for SNS users’ IA, the contract

conclusion between users and SNOs was modelled as a principal-agent dilemma in contract

theory. The model identified asymmetries in information and in power as leverage points where

SNS users are disadvantaged against SNOs, and which hinder the economic balancing of SNO

profit-seeking and privacy within the SNSs market. Under the distinction of three different

cases, it was shown that in non-monopolistic SNS markets, privacy is perceived as a

competitive factor by users and SNOs. Thus, in perfect competition SNOs have the incentive

to provide users with increased transparency and control regarding their PD. Unfortunately, the

current SNS market situation is almost monopolistic in favour of FB [162]. The presented

principal-agent model revealed that under these conditions, regulatory intervention is necessary

to enhance user privacy as a competition factor in the current SNS market. The analysis leads

to the conclusion that a transparency-fostering regulation in combination with ex ante and ex

post TET (i.e. accountability instruments) could achieve a privacy-friendly balance of SNO

profit-seeking and privacy in the current SNS market.
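The incentive structure of the market cases can be illustrated with a minimal decision sketch. The payoff numbers below are invented for illustration and do not come from the dissertation’s formal principal-agent model; they merely encode the qualitative claim that transparency pays off for an SNO only when users can credibly switch.

```python
# Toy sketch of the SNO's incentive under the two market cases described
# above. Payoff values are hypothetical illustration numbers only.

def sno_best_response(competition):
    # Under competition, users can switch networks, so earning trust via
    # transparency attracts users and pays off; under monopoly,
    # transparency only constrains profiling and thus reduces profit.
    profits = {
        "transparent": 8 if competition else 5,
        "opaque": 6 if competition else 9,
    }
    return max(profits, key=profits.get)

print(sno_best_response(competition=True))   # transparent
print(sno_best_response(competition=False))  # opaque
```

The sketch reproduces the chapter’s conclusion: in the near-monopolistic case, only regulatory intervention changes the opaque best response.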

These chapters lead to the conclusion that the current SNS market does not provide sufficient

incentives for SNOs to address the problem of user privacy. Thus, chapter 4 addresses RQ3

about whether governmental regulation provides a regulatory solution for user privacy in SNSs

and its influence on the SNS market. This chapter analysed the GDPR as an example case for

governmental regulation. Therefore, chapter 4 was divided into two subchapters. First,

subchapter 4.1 provided a multidimensional privacy framework combining state-of-the-art

privacy research results and theories regarding SNSs as a tool to evaluate the GDPR. The

framework consisted of three dimensions: privacy condition, privacy type, and time.

Accordingly, in subchapter 4.2, the framework is used to evaluate the GDPR. The regulation

does not cover all identified privacy dimensions, especially the ex ante dimensions of user

privacy. Beyond that, it was found that the GDPR only provides the incentive for SNOs to

deliver the minimum level of required privacy and does not enforce user privacy as a

competition factor in the SNS market. In addition, parts of the regulation’s provisions are

technically difficult to implement.

Classic economic theory assumes that the best market outcomes are achieved through free

competition. Consequently, chapter 5 addresses the remaining research questions RQ4.a and

RQ4.b about whether competition can enhance user privacy in SNSs and which interventions

may direct the SNS market dynamics to enhance it. Subchapter 5.1 addresses RQ4.a and starts

with a clarification that SNSs constitute MSPs with at least two sides: users and advertising

clients. The traded goods within the SNS market were then characterised from three different

viewpoints: users, advertisers, and the SNO. The results showed that SNS membership and its

usage are club goods from a user perspective, while targeted advertisement constitutes a private

good from an advertiser and SNO viewpoint. Accordingly, SNOs compete for advertisers and

for user membership, and more intensely for users’ time. Providers maximize their targeting of

advertisements to attract the advertising clients. SNOs have the incentive to gather as much PD

from users as possible. Therefore, they enhance their platform with additional features and

services as well as content to bind users’ attention and time to their SNS. The analysis of all

factors of SNS competition revealed that they all contain privacy-threatening aspects, except

for the factor of user trust. In addition, the analysis suggests that SNS users might be

discriminated against relative to the advertisers, and therefore overpay in terms of PD disclosure and

utilisation. However, the comparison with market statistics provided insufficient evidence to verify

or reject this hypothesis. In summary, subchapter 5.1 shows that the SNS market competition

does not generally harm or improve user privacy. Yet, the privacy-harming scenario seems more

likely, unless the competition for users’ trust outweighs all the other privacy-divergent

competition factors. Correspondingly, taking the current SNS market development into account

indicates that competition can be assumed to have a negative influence on user privacy at the

present stage.

Building on this, subchapter 5.2 analyses the most promising approaches of the GDPR as well

as supplementary concepts for governmental regulation for their ability to make user privacy a

competition factor in the SNS market to answer RQ4.b. The most promising approach was the

concept of interoperability. Particularly in combination with the GDPR-enforced “right to data

portability”, it dissolves the direct network effects among SNS users and breaks the lock-in

effect. This gives users the opportunity to take their data with them and choose their favourite

SNS based on the remaining competition factors of usability and trust. Nonetheless, the concepts

of interoperability and data portability are subject to technical and economic obstacles. First, a

standard format for SNS data transfer and usage is needed to enable data portability and

interoperability. While promising concepts and approaches for such an open standard format

exist (e.g. OpenSocial; cf. subchapter 5.2.3), its broad and effective market implementation

depends on the largest market participant. However, the implementation is not in the economic

interests of the market-dominating player because effective data portability and interoperability

would intensify competition and endanger its profits. Thus, the enforcement of the two most

promising concepts needs regulatory intervention or a broad base of SNOs supporting,

implementing, and further developing them to put the market leader under competitive pressure.
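What such portability could look like technically can be sketched with a minimal, hypothetical export format. The field names below are invented for illustration and do not follow the actual OpenSocial specification; the point is only that an open, self-describing serialisation is the technical precondition for moving user data between SNSs.

```python
# Hypothetical, minimal profile-export format sketching what SNS data
# portability could look like. Field names are illustration-only and do
# not reproduce the OpenSocial specification.
import json

profile = {
    "user": {"id": "u-123", "display_name": "Example User"},
    "contacts": ["u-456", "u-789"],          # direct network ties
    "content": [                             # user-generated content
        {"type": "post", "created": "2017-06-01", "text": "Hello!"},
    ],
}

# Serialising to an open, self-describing format lets a competing SNS
# re-import the data, which is what breaks the lock-in effect.
export = json.dumps(profile, indent=2)
imported = json.loads(export)
assert imported == profile  # round-trip without loss
```

The economic obstacle discussed above is orthogonal to this technical step: even a lossless format only matters if the dominant SNO is compelled to offer it.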

The dissertation at hand illustrates that user privacy in online social networks is a problem of

economic market dynamics. Information and power asymmetries hinder users from enforcing

their privacy preferences. Furthermore, network effects and switching costs lock them to the

market leading network. This dissertation provides two different ways to economically model

these problems: first, as an interconnected system of different effects with system dynamics

modelling and second, as a principal-agent dilemma in contract theory. Based on this, an

analysis of the European General Data Protection Regulation revealed that this legislation will

not lead to an improvement in user privacy by market competition, but will create an incentive

for the providers to implement the minimum required data security and privacy standard.

Furthermore, subsequent analysis of the present social network market showed that user privacy

is unlikely to be achieved via competition. The following consideration of further approaches

of regulatory interventions illustrated that interoperability and data portability are best suited to

enforcing user privacy as a competition factor in the social network market, despite their

technical and economic difficulties.

6.2 Implications for Future Research

The previous subchapter presents a summary of the addressed research questions and explains

this thesis’ corresponding contributions. Due to the economic and information systems

viewpoint of this dissertation, several unanswered questions and challenges remain that are out

of its scope. Consequently, the main limitations and open questions are discussed in this

subchapter. It illustrates these challenges regarding user privacy in social networks arising in

different research fields and argues for a multidisciplinary research agenda.

The consumption of social media has drastically changed over the last ten years. While FB

started as an SNS that was primarily reached and used through its website on desktop

computers, SNS users now tend to consume social media primarily through their smartphone

and the corresponding SNS apps [289]. Due to their access rights to smartphone memory

and functions, these apps can reveal more PD than average SNS use via a browser and

desktop computer [194, 290, 291]. This dissertation does not include special consideration of

SNS privacy on mobile devices, as it operates at a higher level of economic analysis. Thus, mobile

SNS user privacy should receive extra attention in future research, which could build on the

results and answers derived by the dissertation at hand.

The legal and technical implementations of the competition-enhancing measures mentioned in

subchapter 5.2 could be of interest for further research. Interoperability and portability require

a widely adopted standard for social media, which still lacks best-practice examples with regard

to feature-rich SNSs such as FB that also ensure adequate user privacy. Operational standards could

lead the way to SNSs as interacting systems of a social web instead of walled silos

of social content [260]. Furthermore, they could enable the practical implementation of

PTBs, which are also interesting topics for further computer science and information systems

research. To the author’s best knowledge, there is no working PTB prototype yet. As described

in this dissertation, PTBs could function as an intermediary between users and SNSs, ensuring

data security and giving users control over their PD. Thus, the solutions provided in

subchapter 5.2 could be used as a starting point for further research.

Beyond the specific focus of this dissertation, further economic research questions regarding

user privacy and SNS business models should be addressed in the future. While chapter 5

showed that competition between SNSs does not enhance user privacy in its current state,

future research could aim to develop alternative SNS business models that reconcile

privacy and profitability. From a macro-economic viewpoint, it could be

worthwhile to evaluate whether successful online services such as FB and Google contribute to

global economic imbalances. Their profit-to-employee ratio is conspicuously high, as is the

average salary paid, which has already led to local social upheaval [292]. Some scholars even

suggest that users are being expropriated by BD businesses [19]. However, these allegations

still require scientific proof.

As mentioned in the introduction of this dissertation, the topic of user privacy has its human

rights aspect in addition to other intersections with political science. Recent events revealed

that SNS targeting can address the darkest and deepest feelings of users due to their PD

disclosure [207]. This creates the opportunity to exploit the financial distress of individuals and

opens the door for political campaigning [12, 14]. Thus, one of the most urgent questions is

whether SNSs such as FB can be used to manipulate democratic elections and how this

could be prevented in order to sustain democratic society [13]. In addition, this precise targeting

might make SNS users more vulnerable to the consumption and distribution of fake news, primarily

but not only within political campaigns. Connected to this, the role of internet user privacy and

its adequate level of protection is open to re-evaluation from a political science viewpoint.

Appendix

Literature Review Appendix of Subchapter 2

Authors | Year | Title | Topic | Category
Staykova; Damsgaard | 2015 | A Typology of Multi-sided Platforms: The Core and the Periphery | MSP | Framework
Haucap; Heimeshoff | 2013 | Google, Facebook, Amazon, eBay: Is the Internet Driving Competition or Market Monopolization? | MSP | Discussion
Armstrong | 2006 | Competition in Two-Sided Markets | MSP | Modelling
Choi | 2010 | Tying in Two-Sided Markets with Multi-Homing | MSP | Modelling
Evans; Schmalensee | 2013 | The Antitrust Analysis of Multi-Sided Platform Businesses | MSP | Discussion
Rochet; Tirole | 2003 | Platform Competition in Two-Sided Markets | MSP | Modelling
Eisenmann; Parker; Alstyne | 2006 | Strategies for Two-Sided Markets | MSP | Discussion
Kwon; Oh; Kim | 2015 | One-Sided Competition in Two-Sided Social Platform Markets? An Organizational Ecology Perspective | MSP | Discussion
Zhang; Sarvary | 2011 | Social Media Competition: Differentiation with User-Generated Content | SNS | Modelling
Mital; Sarkar | 2011 | Multihoming Behavior of Users in Social Networking Web Sites: A Theoretical Model | SNS | Modelling
Hyytinen; Takalo | 2004 | Multihoming in the Market for Payment Media: Evidence from Young Finnish Consumers | MSP | Empirical
Doganoglu; Wright | 2006 | Multihoming and Compatibility | MSP | Modelling
Evans; Schmalensee | 2008 | Markets with Two-Sided Platforms | MSP | Discussion
Kane; Alavi; Labianca; Borgatti | 2014 | What's Different about Social Media Networks? A Framework and Research Agenda | SNS | Framework
Knoll | 2016 | Advertising in Social Media: A Review of Empirical Evidence | SNS | Empirical
Zimmermann; Nolte | 2015 | Towards Balancing Privacy & Efficiency | SNS | Modelling
Nolte | 2015 | Personal Data as Payment Method in SNS and Users' Concerning Price Sensitivity | SNS | Empirical
Lawani | 2016 | Improving Users' Trust through Friendly Privacy Policies - an Empirical Study | Privacy | Empirical
Ahn; Duan; Mela | 2011 | An Equilibrium Model of User Generated Content | MSP | Empirical
Tucker | 2012 | Social Advertising | SNS | Discussion
Zittrain | 2009 | The Future of the Internet - and How to Stop It | SNS | Discussion
Taylor | 2004 | Consumer Privacy and the Market for Customer Information | Privacy | Modelling
Tucker | 2015 | Economics of Privacy and User-Generated Content | Privacy | Discussion
Kox; Straathof; Zwart | 2014 | Targeted Advertising, Platform Competition and Privacy | MSP | Modelling
Martin | 2013 | Transaction Costs, Privacy and Trust | SNS | Discussion
Litt | 2013 | Understanding Social Network Site Users' Privacy Tool Use | SNS | Empirical
Jerome | 2013 | Buying and Selling Privacy: Big Data's Different Burdens and Benefits | Privacy | Discussion
Stutzman; Gross; Acquisti | 2012 | Silent Listeners: The Evolution of Privacy and Disclosure on Facebook | SNS | Empirical
O'Brien; Torres | 2012 | Social Networking and Online Privacy: Facebook Users' Perceptions | SNS | Empirical
Lin; Lu | 2011 | Why People Use Social Networking Sites: An Empirical Study Integrating Network Externalities and Motivation Theory | SNS | Empirical
Zhang; Sun; Zhu; Fang | 2010 | Privacy and Security for Online Social Networks: Challenges and Opportunities | SNS | Discussion
Krasnova | 2010 | Online Social Networks: Why We Disclose | SNS | Empirical
Acquisti; Gross | 2013 | Imagined Communities: Awareness, Information Sharing, and Privacy on the Facebook | SNS | Empirical
Evans; Schmalensee | 2010 | Failure to Launch: Critical Mass in Platform Businesses | MSP | Modelling
Boyd; Ellison | 2007 | Social Network Sites: Definition, History, and Scholarship | SNS | Discussion
Westin | 2003 | Social and Political Dimensions of Privacy | Privacy | Discussion
Beuscart; Mellet | 2008 | Business Models of the Web 2.0: Advertising or the Tale of Two Stories | SNS | Discussion
Tucker | 2014 | Social Networks, Personalized Advertising, and Privacy Controls | MSP | Empirical
Schudy; Utikal | 2015 | 'You Must Not Know About Me': On the Willingness to Share Personal Data | Privacy | Empirical
Barary Savadkoohi | 2012 | Personalized Online Promotions - Long-term Impacts on Customer Behavior | MSP | Discussion
Debatin; Lovejoy; Horn; Hughes | 2009 | Facebook and Online Privacy: Attitudes, Behaviors, and Unintended Consequences | SNS | Discussion

Table 15. Breakdown of the Literature Review.

Survey & Experiment Appendix of Subchapter 3.1

Survey Question | Possible Answers

Do you use Facebook? | Yes / No

Have you ever changed your privacy settings on Facebook? | Yes / No / Don't know

To what extent do you agree with the following statements: "Facebook is a trustworthy service provider."; "Facebook complies with the legally prescribed norms and guidelines to protect my data optimally."; "Facebook's data use policy is sufficient to protect my personal data."; "Using Facebook is part of my everyday life." | Strongly disagree / Disagree / Neither agree nor disagree / Agree / Strongly agree / No answer

Approximately how many friends do you have on Facebook? | Number between 0 and 2000.

Which of the following Facebook functions do you use regularly? | Chatting; Messenger (chat app for smartphones); Posting news & reading, liking, and commenting on news from my friends or subscribed pages; Creating events, sending invitations, or confirming attendance; Uploading & sharing photos; Uploading & sharing videos; Playing games or listening to music via corresponding apps within Facebook; None of the above functions

Suppose that, for activating and using the respective function, Facebook would evaluate the following data about you and use it for its own personalised-advertising purposes: – data according to group assignment – Would you activate the event function? Would you activate the timeline (news function)? Would you activate the photo function? Would you activate the video function? Would you activate the function for games and music? | Yes / No / Don't know

What is your highest educational qualification? | No answer; Haupt- (Volks-) Schulabschluss (lower secondary certificate); Realschulabschluss or equivalent (intermediate secondary certificate); Fachhochschulreife or Hochschulreife (higher education entrance qualification); Bachelor's degree; Master's or equivalent/higher degree

You are … | Female / male / no answer

Please state your age! | Number between 15 and 99.

Table 16. Survey Design.

309 Yes

11 No

Table 17. Distribution FB Users

137 Female

164 Male

8 No answer

Table 18. Gender Distribution.

5 Haupt- (Volks-) Schulabschluss (lower secondary certificate)

30 Realschulabschluss or equivalent (intermediate secondary certificate)

133 Fachhochschulreife or Hochschulreife (higher education entrance qualification)

50 Bachelor's degree

75 Master's or equivalent/higher degree

16 No answer

Table 19. Educational Achievement Distribution.

Variable Explanation

func.all Total of FB functions used by the participant.

gr.func.all Total of functions chosen by the participant during the experiment.

func.diff Difference between gr.func.all and func.all for the participant.

exp.gr Assigned experiment group (1, 2, 3, or 4).

pr.awa The participant’s level of privacy awareness.

gen.male Gender of the participant (=1 if male).

educ Educational level of the participant.

age The participant’s age.

fb.use Degree of Facebook usage of the participant.

fb.friends Participant’s number of Facebook friends.

Table 20. Variable Overview.
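As a minimal illustration of how the experiment variables in Table 20 relate, the following sketch computes func.diff; the participant values shown are invented for illustration and are not drawn from the raw data.

```python
# Sketch of the variable relationship in Table 20:
# func.diff = gr.func.all - func.all. The sample values are hypothetical.

def func_diff(func_all, gr_func_all):
    # A negative value means the participant chose fewer functions under
    # the experimental data-disclosure conditions than they normally use.
    return gr_func_all - func_all

# Hypothetical participant: uses 5 FB functions normally, but would only
# activate 3 when the PD "price" of each function is made explicit.
print(func_diff(func_all=5, gr_func_all=3))  # -2 indicates price sensitivity
```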

Survey Raw Data of Subchapter 3.1:

CASE "SERIAL" "REF" "QUESTNNR" "MODE" "STARTED" "A101" "A201_01" "A201_02" "A201_03" "A203_01" "A203_02" "A203_03" "A202_01" "A202_03" "A202_04" "A301_01" "A302_01" "A302_01a" "B101_01" "B101_02" "B101_03" "B101_07" "B101_04" "B101_05" "B101_06" "B101_08" "C101_01" "C102_01" "C103_01" "C104_01" "C105_01" "C106_01" "C107_01" "D101" "D102" "D103_01" "D103_01a" "C201_01" "C202_01" "C203_01" "C204_01" "C205_01" "C206_01" "C207_01" "C301_01" "C302_01" "C303_01" "C304_01" "C305_01" "C306_01" "C307_01" "C001_01" "C002_01" "C003_01" "C004_01" "C005_01" "C006_01" "C007_01" "G101_01" "FINISHED" "LASTPAGE" "MAXPAGE" "MISSING" "MISSREL" "DEG_MISS" "DEG_TIME" "DEGRADE"

109 "UmfrageOSN" "interview" 1 T F F F F T 2 2 2 4 580 F F F F T T F F F 4 2 T 1 1 2 2 1 1 "2" T 11 0 0 0 0 100 100

119 "UmfrageOSN" "interview" 1 T F F F T F 1 2 2 5 800 F T F T T F F F F 3 2 20 F 1 1 2 2 1 1 1 "4" T 11 0 0 0 0 100 100

122 "UmfrageOSN" "interview" 1 T F F F T F 1 2 2 4 175 F T T T T F F F F 1 1 2 2 1 5 3 27 F "1" T 11 0 0 0 0 100 100

125 "UmfrageOSN" "interview" 1 T F F F F T 2 2 1 5 105 F T T T F T F F F 2 1 22 F 1 1 1 1 1 1 1 "4" T 11 0 0 0 0 100 100

133 "UmfrageOSN" "interview" 1 T F F T F F 2 2 2 5 380 F F F T T T F F F 2 2 2 5 1 27 F "1" T 11 0 0 0 0 100 100

135 "UmfrageOSN" "interview" 1 T F F F T F 2 3 1 4 1200 F T T T T T F F F 4 2 23 F 2 2 2 1 1 "4" T 11 0 0 0 0 100 100

136 "UmfrageOSN" "interview" 1 T F F T F F 1 2 4 400 F T F T F T F F F 3 1 21 F 1 1 1 1 1 "2" T 11 0 0 0 0 100 100

142 "UmfrageOSN" "interview" 1 T F F T F F 4 2 3 5 1466 F T F T T T F F F 4 1 25 F 2 2 1 1 2 2 2 "4" T 11 0 0 0 0 100 100

143 "UmfrageOSN" "interview" 1 T F F T F F 3 3 3 4 800 F T T T T T T F F 3 2 20 F 1 1 1 1 1 1 1 "3" T 11 0 0 0 0 100 100

144 "UmfrageOSN" "interview" 1 T F F F T F 2 1 2 4 120 F F F T T T F F F 2 2 2 2 2 2 1 5 2 40 F "1" T 11 0 0 0 0 100 100

145 "UmfrageOSN" "interview" 1 T F F T F F 2 2 3 5 360 F T F T T T F F F 3 2 19 F 1 1 1 1 1 1 1 "3" T 11 0 0 0 0 100 100

146 "UmfrageOSN" "interview" 1 T F F F T F 2 4 1000 F T T T F T F F F 4 2 29 F 2 2 2 2 2 2 "2" T 11 0 0 0 0 100 100

152 "UmfrageOSN" "interview" 1 T F F F F T 4 5 150 F T T T T T F F F 4 2 25 F 2 2 2 2 2 2 2 "2" T 11 0 0 0 0 100 100

156 "UmfrageOSN" "interview" 1 T F F F T F 1 1 1 5 250 F T T T F T F F F 2 2 17 F 1 1 1 1 1 1 1 "3" T 11 0 0 0 0 100 100

157 "UmfrageOSN" "interview" 1 T F F T F F 3 3 2 4 70 F T F T F T F F F 3 1 28 F 1 1 1 1 1 1 1 "4" T 11 0 0 0 0 100 100

158 "UmfrageOSN" "interview" 1 T F F F T F 1 2 2 5 1600 F T T T T T F F F 1 1 1 1 1 1 1 4 2 24 F "1" T 11 0 0 0 0 100 100

159 "UmfrageOSN" "interview" 1 T F F F T F 1 2 1 2 1 F F F F F F F F T 1 1 1 1 1 1 1 3 2 29 F "1" T 11 0 0 0 0 100 100

161 "UmfrageOSN" "interview" 1 T F F T F F 3 3 4 4 340 F T T T T T T F F 4 2 48 F 1 2 2 2 2 2 1 "3" T 11 0 0 0 0 100 100

162 "UmfrageOSN" "interview" 1 T F F T F F 2 2 2 2 410 F T T T T T F T F 3 2 26 F 1 "2" T 11 0 0 0 0 100 100

164 "UmfrageOSN" "interview" 1 T F F F T F 2 1 1 4 300 F T T T F F F F F 3 2 21 F 2 2 2 2 2 2 "2" T 11 0 0 0 0 100 100

165 "UmfrageOSN" "interview" 1 T F F T F F 2 2 1 4 200 F F T T T T F F F 3 2 25 F 1 1 1 1 1 1 1 "4" T 11 0 0 0 0 100 100

166 "UmfrageOSN" "interview" 1 T F F F F T 3 3 3 5 200 F T T T T F F F F 2 2 2 2 2 2 1 3 2 23 F "1" T 11 0 0 0 0 100 100

169 "UmfrageOSN" "interview" 1 T F F F T F 3 3 3 2 180 F F T T F F F F F 3 3 24 F 2 2 1 2 1 1 1 "3" T 11 0 0 0 0 100 100

172 "UmfrageOSN" "interview" 1 T F F F T F 3 3 4 4 180 F T T T T F F F F 2 2 2 2 2 2 2 5 2 26 F "1" T 11 0 0 0 0 100 100

173 "UmfrageOSN" "interview" 1 T F F T F F 1 1 1 5 400 F T T T T T F F F 5 1 34 F 1 1 1 1 1 1 1 "4" T 11 0 0 0 0 100 100

174 "UmfrageOSN" "interview" 1 T F F T F F 2 2 2 4 350 F T F T T F F F F 2 1 1 1 1 1 3 1 18 F "1" T 11 0 0 0 0 100 100

177 "UmfrageOSN" "interview" 1 T F F T F F 4 3 2 2 500 F F F T F F F F F 5 2 31 F 1 1 1 2 2 2 1 "4" T 11 0 0 0 0 100 100

178 "UmfrageOSN" "interview" 1 T F F F T F 1 1 1 5 100 F T F T T F F F F 3 2 21 F 1 1 1 1 1 1 2 "2" T 11 0 0 0 0 100 100

179 "UmfrageOSN" "interview" 1 T F F F T F 2 2 2 4 400 F F T T T T F F F 5 2 52 F 1 1 1 1 1 1 1 "3" T 11 0 0 0 0 100 100

183 "UmfrageOSN" "interview" 1 T F F F T F 2 1 2 4 70 F T F F F F F F F 3 2 22 F 2 2 1 2 1 1 "2" T 11 0 0 0 0 100 100

184 "UmfrageOSN" "interview" 1 T F F F F T 2 3 1 5 150 F T T T T T F F F 1 1 2 2 1 1 2 2 1 17 F "1" T 11 0 0 0 0 100 100

185 "UmfrageOSN" "interview" 1 T F F F T F 2 2 3 4 60 F T F T F T F F F 5 1 28 F 1 1 1 1 1 1 1 "3" T 11 0 0 0 0 100 100

189 "UmfrageOSN" "interview" 1 T F F F T F 2 3 4 129 F T T T T T T T F 2 2 16 F 2 2 1 2 2 2 2 "4" T 11 0 0 0 0 100 100

190 "UmfrageOSN" "interview" 1 T F F T F F 4 4 4 5 1000 F F F T T T F F F 5 2 31 F 1 "3" T 11 0 0 0 0 100 100

191 "UmfrageOSN" "interview" 1 T F F F T F 1 2 1 4 100 F T F T F T F F F 3 2 25 F 1 1 1 1 1 1 "2" T 11 0 0 0 0 100 100

192 "UmfrageOSN" "interview" 1 T F F T F F 1 2 1 4 500 F T F T F F F F F 2 1 1 1 1 1 1 3 2 23 F "1" T 11 0 0 0 0 100 100

193 "UmfrageOSN" "interview" 1 T F F F T F 2 2 4 4 300 F T T T T T F F F 2 1 17 F 2 2 1 2 2 1 "4" T 11 0 0 0 0 100 100

195 "UmfrageOSN" "interview" 1 T F F F F T 1 1 1 4 200 F F F T T F F F F 3 2 29 F 1 1 1 1 1 1 1 "3" T 11 0 0 0 0 100 100

197 "UmfrageOSN" "interview" 2 T 3 0 0 0 0 100 100

200 "UmfrageOSN" "interview" 1 T F F F T F 1 2 1 4 450 F T T T F T F F F 1 1 1 1 1 1 1 3 1 24 F "1" T 11 0 0 0 0 100 100

201 "UmfrageOSN" "interview" 1 T F F T F F 1 1 1 3 400 F F F F F F F F T 6 2 29 F 1 1 1 1 1 2 "2" T 11 0 0 0 0 100 100

202 "UmfrageOSN" "interview" 1 T F F F F T 2 3 2 2 120 F T F T T F F F F 3 1 21 F 2 1 1 1 1 1 "3" T 11 0 0 0 0 100 100

203 "UmfrageOSN" "interview" 1 T F F F T F 2 2 2 5 600 F T T T T F F F F 2 2 2 4 2 25 F "1" T 11 0 0 0 0 100 100

204 "UmfrageOSN" "interview" 1 T F F T F F 1 2 2 5 900 F T F T T T F F F 5 1 27 F 1 1 1 1 1 "2" T 11 0 0 0 0 100 100

206 "UmfrageOSN" "interview" 1 T F F F T F 1 2 1 2 120 F T F T F F F F F 4 1 25 F 2 1 1 2 2 2 2 "4" T 11 0 0 0 0 100 100

207 "UmfrageOSN" "interview" 1 T F F F T F 1 2 1 4 220 F F F T F F F T F 5 1 45 F 1 1 1 1 1 1 "2" T 11 0 0 0 0 100 100

208 "UmfrageOSN" "interview" 2 T 3 0 0 0 0 100 100

210 "UmfrageOSN" "interview" 1 T F F T F F 3 3 4 5 650 F F F T T T T F F 5 2 53 F 1 1 2 1 "3" T 11 0 0 0 0 100 100

211 "UmfrageOSN" "interview" 1 T F F T F F 2 2 4 150 F T F T T F F T F 3 1 32 F 1 1 1 1 1 1 2 "2" T 11 0 0 0 0 100 100

212 "UmfrageOSN" "interview" 1 T F F F F F 2 2 1 4 450 F T F T T T T F F 1 1 2 2 2 T "1" T 11 0 0 0 0 100 100

213 "UmfrageOSN" "interview" 1 T F F F T F 1 3 2 5 378 F T F T F T F F F 2 2 18 F 2 1 2 2 2 2 1 "4" T 11 0 0 0 0 100 100

217 "UmfrageOSN" "interview" 1 T F F T F F 3 2 2 5 900 F T F T F T F F F 3 1 20 F 1 1 2 2 2 "3" T 11 0 0 0 0 100 100

219 "UmfrageOSN" "interview" 1 T F F F T F 1 1 1 5 260 F T F T T F F F F 2 1 18 F 1 1 1 1 1 "4" T 11 0 0 0 0 100 100

220 "UmfrageOSN" "interview" 1 T F F F T F 2 3 2 5 230 F T F T F F F F F 4 2 24 F 1 1 2 2 2 2 1 "2" T 11 0 0 0 0 100 100

222 "UmfrageOSN" "interview" 1 T F F F T F 1 1 4 600 F T F F T F F F F 1 1 1 1 1 1 4 2 24 F "1" T 11 0 0 0 0 100 100

223 "UmfrageOSN" "interview" 1 T F F T F F 1 3 1 4 300 F T F T T F F F F 3 1 20 F 2 1 2 1 1 1 1 "4" T 11 0 0 0 0 100 100

224 "UmfrageOSN" "interview" 2 T 3 0 0 0 0 100 100

225 "UmfrageOSN" "interview" 2 T 3 0 0 0 0 100 100

227 "UmfrageOSN" "interview" 1 T F F F T F 2 1 2 5 150 F T T T T T F F F 2 2 2 2 2 1 1 4 2 25 F "1" T 11 0 0 0 0 100 100

229 "UmfrageOSN" "interview" 1 T F F T F F 3 2 2 4 300 F T T T T F F F F 5 2 31 F 1 1 2 2 1 1 2 "3" T 11 0 0 0 0 100 100

231 "UmfrageOSN" "interview" 1 T F F F F T 2 2 1 4 550 F T F T T F F F F 3 2 17 F 2 1 1 1 1 "4" T 11 0 0 0 0 100 100

234 "UmfrageOSN" "interview" 1 T F F T F F 3 2 1 5 140 F T T T F F F F F 4 2 32 F 1 1 2 2 1 1 1 "3" T 11 0 0 0 0 100 100

235 "UmfrageOSN" "interview" 1 T F F T F F 3 4 2 4 1000 F F T T F T F F F 1 1 2 2 2 2 2 3 2 31 F "1" T 11 0 0 0 0 100 100

236 "UmfrageOSN" "interview" 1 T F F T F F 2 1 2 5 900 F T T T F F F F F 3 2 20 F 2 2 2 2 1 1 2 "2" T 11 0 0 0 0 100 100

238 "UmfrageOSN" "interview" 1 T F F T F F 3 3 4 5 370 F T T T T T F F F 3 2 21 F 2 1 2 2 2 2 1 "4" T 11 0 0 0 0 100 100

239 "UmfrageOSN" "interview" 1 T F F T F F 2 4 2 4 300 F T T T F T F F F 3 2 18 F 1 1 2 2 1 1 1 "2" T 11 0 0 0 0 100 100

240 "UmfrageOSN" "interview" 1 T F F T F F 4 3 2 5 500 F T T F T F F F F 3 2 20 F 2 2 2 2 2 2 1 "4" T 11 0 0 0 0 100 100

241 "UmfrageOSN" "interview" 2 T 3 0 0 0 0 100 100

243 "UmfrageOSN" "interview" 1 T F F T F F 1 1 1 4 450 F T T T T T F F F 3 1 25 F 1 1 2 1 1 2 "3" T 11 0 0 0 0 100 100

244 "UmfrageOSN" "interview" 1 T F F F T F 2 2 2 5 400 F T T T F F F F F 2 2 2 1 3 1 33 F "1" T 11 0 0 0 0 100 100

245 "UmfrageOSN" "interview" 1 T F F T F F 2 3 2 5 270 F T T F F F F F F 1 1 1 1 1 1 1 2 1 17 F "1" T 11 0 0 0 0 100 100

248 "UmfrageOSN" "interview" 1 T F F T F F 1 1 1 2 300 F F F T F F F F F 5 2 31 F 1 1 2 2 1 1 1 "2" T 11 0 0 0 0 100 100

251 "UmfrageOSN" "interview" 1 T F F F T F 2 4 2 4 250 F T T T T T F F F 4 2 24 F 2 1 1 1 1 1 1 "3" T 11 0 0 0 0 100 100

256 "UmfrageOSN" "interview" 2 T 3 0 0 0 0 100 100

257 "UmfrageOSN" "interview" 1 T F F F F T 2 1 1 4 T T F T F F F F F 3 3 T 2 1 1 1 1 1 2 "3" T 11 0 0 0 0 100 100

264 "UmfrageOSN" "interview" 1 T F F F F T 4 4 3 5 245 F T T T T F F F F 2 2 2 2 2 2 2 3 2 33 F "1" T 11 0 0 0 0 100 100

265 "UmfrageOSN" "interview" 1 T F F T F F 1 1 2 5 1050 F T F T T F F F F 3 1 19 F 1 1 2 1 1 1 "4" T 11 0 0 0 0 100 100

266 "UmfrageOSN" "interview" 1 T F F F T T 2 3 2 4 325 F T F T F T F F F 4 1 23 F 1 1 1 "2" T 11 0 0 0 0 100 100

267 "UmfrageOSN" "interview" 1 T F F F T F 2 1 1 5 400 F F F T F T F F F 3 2 23 F 1 1 2 1 1 1 1 "2" T 11 0 0 0 0 100 100

268 "UmfrageOSN" "interview" 1 T F F F T F 1 1 2 4 190 F F T T F F F F F 3 1 21 F 1 1 1 1 1 1 1 "3" T 11 0 0 0 0 100 100

269 "UmfrageOSN" "interview" 2 T 3 0 0 0 0 100 100

271 "UmfrageOSN" "interview" 1 T F F T F F 2 2 2 1 170 F F F F T F F F F 1 1 1 1 1 1 3 1 32 F "1" T 11 0 0 0 0 100 100

273 "UmfrageOSN" "interview" 1 T F F T F F 2 1 1 5 300 F F F T T T T F F 5 1 28 F 1 1 1 1 1 1 1 "4" T 11 0 0 0 0 100 100

274 "UmfrageOSN" "interview" 1 T F F T F F 2 2 1 3 360 F F F T F F F F F 5 2 27 F 1 1 1 1 1 1 1 "2" T 11 0 0 0 0 100 100

275 "UmfrageOSN" "interview" 1 T F F F T F 1 4 2 5 1800 F T F T T T F F F 1 1 2 2 2 2 1 5 1 45 F "1" T 11 0 0 0 0 100 100

276 "UmfrageOSN" "interview" 1 T F F T F F 1 1 1 5 200 F F F T F F F F F 4 2 26 F 1 1 1 2 1 1 1 "3" T 11 0 0 0 0 100 100

278 "UmfrageOSN" "interview" 1 T F F F T F 4 4 4 5 420 F T F T F T F F F 5 2 26 F 2 2 2 2 2 "2" T 11 0 0 0 0 100 100

279 "UmfrageOSN" "interview" 1 T F F F T F 3 2 3 4 270 F T F T T F F F F 3 1 20 F 2 2 2 2 "3" T 11 0 0 0 0 100 100

280 "UmfrageOSN" "interview" 1 F T F F T F 3 4 5 450 F T F T F T F T F 3 1 40 F 1 1 1 1 1 1 "4" T 11 0 0 0 0 100 100

281 "UmfrageOSN" "interview" 1 T F F F F F 1 3 2 4 250 F T T F T F F F F 2 2 1 1 1 1 1 3 2 19 F "1" T 11 0 0 0 0 100 100

282 "UmfrageOSN" "interview" 1 F F F T F F 2 2 2 5 300 F T T F F F F F F 6 2 14 F 1 1 2 1 1 1 1 "2" T 11 0 0 0 0 100 100

286 "UmfrageOSN" "interview" 1 T F F F T F 2 1 1 2 200 F F F T F T F F F 5 1 T 1 2 2 2 2 "4" T 11 0 0 0 0 100 100

291 "UmfrageOSN" "interview" 1 T F F T F F 2 4 4 3 270 F T F F T F F F F 3 1 23 F 2 2 2 2 "3" T 11 0 0 0 0 100 100

295 "UmfrageOSN" "interview" 1 T F F F T F 2 2 2 5 150 F T F T T T F F F 4 1 55 F 1 1 1 1 1 "3" T 11 0 0 0 0 100 100

297 "UmfrageOSN" "interview" 1 T F F F T F 2 1 2 3 190 F T F F T F F F F 2 1 20 F 1 1 1 1 1 1 "2" T 11 0 0 0 0 100 100

299 "UmfrageOSN" "interview" 1 F F T F T F 2 2 2 5 280 F T T T F F F F F 2 2 2 2 1 6 2 16 F "1" T 11 0 0 0 0 100 100

301 "UmfrageOSN" "interview" 1 T F F F T F 2 4 1 5 550 F T T F T T F F F 3 1 21 F 1 1 2 2 2 2 2 "4" T 11 0 0 0 0 100 100

305 "UmfrageOSN" "interview" 2 T 3 0 0 0 0 100 100

308 "UmfrageOSN" "interview" 1 T F F T F T 3 3 3 4 80 F T T T F T F F F 3 2 18 F 2 2 2 1 1 "3" T 11 0 0 0 0 100 100

309 "UmfrageOSN" "interview" 1 T F F T F F 2 1 2 5 120 F F F T F F F F F 1 1 2 2 2 1 1 5 1 58 F "1" T 11 0 0 0 0 100 100

310 "UmfrageOSN" "interview" 1 T F F F T F 1 2 4 100 F F F T F F F F F 3 1 30 F 1 1 1 1 1 1 1 "4" T 11 0 0 0 0 100 100

312 "UmfrageOSN" "interview" 1 T F F F T F 1 1 2 5 1000 F F T T F T F F F 3 2 21 F 2 1 1 2 1 1 2 "2" T 11 0 0 0 0 100 100

313 "UmfrageOSN" "interview" 1 T F F T T F 3 3 3 4 140 F F T T T F F F F 1 1 2 5 2 34 F "1" T 11 0 0 0 0 100 100

315 "UmfrageOSN" "interview" 1 T F F F T F 2 2 4 150 F F F T F F F F F 4 1 49 F 1 1 2 1 1 1 "2" T 11 0 0 0 0 100 100

316 "UmfrageOSN" "interview" 1 T F F F T F 2 3 2 5 1350 F T F T T T F F F 3 2 22 F 2 1 2 2 2 2 1 "4" T 11 0 0 0 0 100 100

317 "UmfrageOSN" "interview" 1 T F F T F F 3 3 2 4 210 F T F T T T T F F 3 2 75 F 1 1 1 1 1 1 1 "4" T 11 0 0 0 0 100 100

318 "UmfrageOSN" "interview" 1 T F F F T F 1 1 1 4 70 F T F T F F F F F 5 3 28 F 1 1 1 1 1 1 1 "3" T 11 0 0 0 0 100 100

319 "UmfrageOSN" "interview" 1 T F F F T F 3 4 5 190 F T F F F F F F F 5 1 33 F 2 "3" T 11 0 0 0 0 100 100

322 "UmfrageOSN" "interview" 1 F F F F F F 4 4 50 F T F T F F F F F 2 1 62 F 1 1 1 1 "2" T 11 0 0 0 0 100 100

323 "UmfrageOSN" "interview" 1 T F F F T F 3 4 3 5 300 F T F T F F F F F 5 2 28 F 2 "4" T 11 0 0 0 0 100 100

325 "UmfrageOSN" "interview" 1 T F F F F T 500 F T F T F T F F F 2 2 T "1" T 11 0 0 0 0 100 100

326 "UmfrageOSN" "interview" 1 T F F F T F 3 2 3 3 150 F T F T F F F F F 6 1 15 F 2 1 1 2 1 2 "2" T 11 0 0 0 0 100 100

327 "UmfrageOSN" "interview" 2 T 3 0 0 0 0 100 100

331 "UmfrageOSN" "interview" 1 T F F T F F 3 2 2 4 200 F T F T F F F F F 5 2 50 F 1 1 1 1 1 1 1 "3" T 11 0 0 0 0 100 100

334 "UmfrageOSN" "interview" 1 T F F T F F 1 2 1 2 50 F F F F F F F F T 5 2 T 1 1 1 1 1 1 1 "2" T 11 0 0 0 0 100 100

335 "UmfrageOSN" "interview" 1 T F F F T F 2 2 3 4 350 F F F F F F F T F 5 1 25 F 1 2 2 2 "4" T 11 0 0 0 0 100 100

337 "UmfrageOSN" "interview" 1 T F F F T F 2 1 2 4 400 F T F T F T F F F 1 1 2 2 1 1 1 4 2 27 F "1" T 11 0 0 0 0 100 100

339 "UmfrageOSN" "interview" 1 T F F F T F 2 2 2 5 450 F T T T T T F F F 3 2 18 F 2 2 2 2 2 2 "4" T 11 0 0 0 0 100 100

340 "UmfrageOSN" "interview" 1 T F F F F T 2 2 3 350 F F F T T T F F F 3 2 45 F 1 1 1 1 1 1 1 "2" T 11 0 0 0 0 100 100

342 "UmfrageOSN" "interview" 1 F T F F T F 3 2 4 5 400 F F T T F F F F F 2 2 2 2 2 2 2 5 2 38 F "1" T 11 0 0 0 0 100 100

343 "UmfrageOSN" "interview" 1 T F F T F F 1 2 1 5 150 F T F F F F F F F 6 2 18 F 1 1 2 2 2 2 2 "3" T 11 0 0 0 0 100 100

350 "UmfrageOSN" "interview" 1 T F F T F F 3 3 3 4 280 F F T T F F F F F 3 2 23 F 1 1 "4" T 11 0 0 0 0 100 100

351 "UmfrageOSN" "interview" 1 T F F F T F 3 3 2 5 153 F T F T T T T F F 1 1 1 1 1 1 1 3 2 24 F "1" T 11 0 0 0 0 100 100

352 "UmfrageOSN" "interview" 1 T F F T F F 2 2 2 5 1500 F F T T T T F F F 4 1 29 F 1 1 1 1 "2" T 11 0 0 0 0 100 100

356 "UmfrageOSN" "interview" 1 T F F T F F 2 2 2 4 230 F T T T F F F F F 3 2 18 F 2 1 1 1 2 2 1 "4" T 11 0 0 0 0 100 100

357 "UmfrageOSN" "interview" 1 T F F F T F 1 1 1 2 400 F T T T T F F F F 3 2 20 F 1 1 2 1 1 1 "2" T 11 0 0 0 0 100 100


359 "UmfrageOSN" "interview" 1 T F F T F F 1 2 1 4 500 F F T T T T F F F 1 1 1 1 1 1 1 3 1 20 F "1" T 11 0 0 0 0 100 100

360 "UmfrageOSN" "interview" 1 T F F T F F 1 3 2 5 1100 F T T T T T F T F 4 1 25 F 2 2 "3" T 11 0 0 0 0 100 100

365 "UmfrageOSN" "interview" 1 T F F F T F 1 1 1 4 120 F T F T T T F F F 1 2 1 1 3 2 21 F "1" T 11 0 0 0 0 100 100

366 "UmfrageOSN" "interview" 1 T F F T F F 3 2 2 5 700 F T F T T T F F F 3 1 28 F 1 1 1 1 1 2 "3" T 11 0 0 0 0 100 100

369 "UmfrageOSN" "interview" 1 F T F F F F 5 5 5 80 F F F T F F F F F 5 3 T 2 2 2 2 "2" T 11 0 0 0 0 100 100

375 "UmfrageOSN" "interview" 1 T F F T F F 2 3 2 2 20 F F F T F F F F F 2 2 47 F 1 1 1 1 1 1 1 "3" T 11 0 0 0 0 100 100

376 "UmfrageOSN" "interview" 1 T F F T F F 3 2 2 5 1500 F F T T T T F F F 5 1 33 F 1 1 2 2 2 2 1 "2" T 11 0 0 0 0 100 100

378 "UmfrageOSN" "interview" 1 T F F T F F 2 2 2 2 180 F F F T F F F F T 5 2 30 F 1 1 1 1 1 1 1 "4" T 11 0 0 0 0 100 100

379 "UmfrageOSN" "interview" 1 T F F T F F 2 2 2 4 90 F T F F F F F F F 5 2 67 F 1 1 "3" T 11 0 0 0 0 100 100

380 "UmfrageOSN" "interview" 1 F T F F T F 3 2 2 5 180 F T F T F T F F F 1 1 1 1 1 1 2 2 48 F "1" T 11 0 0 0 0 100 100

382 "UmfrageOSN" "interview" 1 T F F T F F 1 1 1 1 100 F T F T T T F F F 3 1 41 F 1 1 1 1 1 1 1 "4" T 11 0 0 0 0 100 100

383 "UmfrageOSN" "interview" 1 T F F F T F 2 2 4 4 20 F F F T F F F F F 2 1 46 F 1 1 1 1 1 1 "2" T 11 0 0 0 0 100 100

384 "UmfrageOSN" "interview" 1 T F F T F F 3 3 2 3 380 F T F F T F F F F 1 2 1 1 4 1 24 F "1" T 11 0 0 0 0 100 100

386 "UmfrageOSN" "interview" 1 T F F T F F 3 3 2 5 140 F T T T F T F F F 5 1 23 F 2 2 1 1 2 "3" T 11 0 0 0 0 100 100

387 "UmfrageOSN" "interview" 1 T F F F T F 1 1 1 4 200 F T F F T F F F F 3 1 20 F 1 1 1 1 "4" T 11 0 0 0 0 100 100

389 "UmfrageOSN" "interview" 1 T F F F F T 1 1 1 5 500 F T F T F T F F F 3 1 19 F 1 1 1 1 1 1 2 "2" T 11 0 0 0 0 100 100

390 "UmfrageOSN" "interview" 1 T F F F T F 2 2 1 5 370 F F T F F F F F F 5 1 25 F 1 2 2 1 1 1 1 "2" T 11 0 0 0 0 100 100

392 "UmfrageOSN" "interview" 1 T F F F T F 1 1 1 5 650 F T T T T T T F F 3 2 25 F 2 2 2 1 1 1 "4" T 11 0 0 0 0 100 100

393 "UmfrageOSN" "interview" 1 T F F F T F 1 1 1 5 600 F F F T F F F F F 5 2 24 F 2 1 1 2 2 1 2 "3" T 11 0 0 0 0 100 100

394 "UmfrageOSN" "interview" 1 T F F T F F 2 2 2 3 200 F T T F F F F F F 1 1 1 1 1 1 3 2 26 F "1" T 11 0 0 0 0 100 100

396 "UmfrageOSN" "interview" 1 T F F T F F 3 2 2 4 130 F T F T F T F F F 4 2 25 F 2 1 1 2 2 1 1 "2" T 11 0 0 0 0 100 100

397 "UmfrageOSN" "interview" 1 T F F T F F 3 2 3 5 1030 F T T T T T F F F 3 2 43 F 1 1 1 2 1 "4" T 11 0 0 0 0 100 100

398 "UmfrageOSN" "interview" 1 T F F T F F 3 4 2 4 45 F T T T T T T F F 2 2 2 3 2 34 F "1" T 11 0 0 0 0 100 100

399 "UmfrageOSN" "interview" 1 T F F T F F 2 2 2 5 206 F F F T F T F F F 3 2 43 F 1 1 2 2 2 2 2 "3" T 11 0 0 0 0 100 100

401 "UmfrageOSN" "interview" 1 T F F T F F 2 2 2 5 210 F T T T T T F F F 2 1 2 1 2 2 1 3 2 23 F "1" T 11 0 0 0 0 100 100

409 "UmfrageOSN" "interview" 1 T F F T F F 2 1 1 5 60 F F F T F F F F F 5 1 40 F 1 1 1 2 1 1 1 "2" T 11 0 0 0 0 100 100

410 "UmfrageOSN" "interview" 1 T F F T F F 2 2 2 4 400 F T T T T T F F F 2 1 1 5 2 26 F "1" T 11 0 0 0 0 100 100

414 "UmfrageOSN" "interview" 1 T F F T F F 3 3 2 4 1200 F T F T T T T F F 5 2 41 F 1 "4" T 11 0 0 0 0 100 100

416 "UmfrageOSN" "interview" 1 T F F F T F 1 1 1 5 400 F T T T T T F F F 3 2 23 F 1 1 1 1 1 1 1 "3" T 11 0 0 0 0 100 100

418 "UmfrageOSN" "interview" 1 T F F T F F 4 4 5 5 1400 F T T T T T F F F 4 2 21 F 2 2 2 2 2 2 2 "4" T 11 0 0 0 0 100 100

422 "UmfrageOSN" "interview" 1 T F F F T F 2 3 3 4 515 F T T T T F F F F 1 1 2 2 2 2 1 1 1 17 F "1" T 11 0 0 0 0 100 100

424 "UmfrageOSN" "interview" 1 T F F F T F 1 1 1 5 300 F T T T F F F F F 4 2 23 F 2 2 1 2 2 1 2 "4" T 11 0 0 0 0 100 100

425 "UmfrageOSN" "interview" 1 T F F T F F 4 4 3 4 120 F F F T T T F F F 3 1 50 F 1 1 2 2 2 1 1 "3" T 11 0 0 0 0 100 100

428 "UmfrageOSN" "interview" 1 T F F T F F 3 3 3 4 250 F F F T T T T F F 5 2 40 F 1 1 2 2 2 2 "2" T 11 0 0 0 0 100 100

430 "UmfrageOSN" "interview" 1 T F F F T F 1 2 1 4 300 F T T T T T F F F 1 1 1 1 1 3 2 19 F "1" T 11 0 0 0 0 100 100

431 "UmfrageOSN" "interview" 1 T F F T F F 3 3 2 4 390 F T T T F T F F F 5 1 25 F 1 1 1 1 1 1 1 "3" T 11 0 0 0 0 100 100

432 "UmfrageOSN" "interview" 1 T F F F T F 1 1 2 2 120 F F F T F T F F F 3 1 22 F 1 1 2 2 1 1 "4" T 11 0 0 0 0 100 100

433 "UmfrageOSN" "interview" 1 T F F F T F 5 5 5 3 100 F F F T F F F F F 5 2 28 F 1 1 1 1 1 1 1 "3" T 11 0 0 0 0 100 100

435 "UmfrageOSN" "interview" 1 F F T F T F 2 1 3 T T F T F F F F F 1 1 1 1 1 3 2 T "1" T 11 0 0 0 0 100 100

436 "UmfrageOSN" "interview" 1 T F F T F F 4 4 2 5 480 F T T T F T F F F 3 2 21 F 1 1 2 2 2 1 1 "2" T 11 0 0 0 0 100 100

437 "UmfrageOSN" "interview" 1 T F F F T F 1 1 2 2 200 F F F T F T F F F 3 2 40 F 1 1 2 1 1 1 2 "4" T 11 0 0 0 0 100 100

442 "UmfrageOSN" "interview" 1 F F F T F F 1 1 1 2 50 F T F T F F F F F 3 1 21 F 1 1 1 1 1 1 1 "3" T 11 0 0 0 0 100 100

444 "UmfrageOSN" "interview" 1 T F F T F F 2 1 2 4 80 F T T F T F F F F 3 2 24 F 2 2 1 "2" T 11 0 0 0 0 100 100

448 "UmfrageOSN" "interview" 1 T F F T F F 1 1 1 3 255 F T F F F F F F F 1 1 1 1 1 1 1 3 1 22 F "1" T 11 0 0 0 0 100 100

451 "UmfrageOSN" "interview" 1 T F F F T F 1 1 1 2 38 F F F F F F F T F 5 2 34 F 1 1 1 1 1 1 1 "2" T 11 0 0 0 0 100 100

453 "UmfrageOSN" "interview" 1 T F F T F F 3 3 3 5 70 F F F T F T F F F 4 1 36 F 1 1 1 2 "4" T 11 0 0 0 0 100 100

455 "UmfrageOSN" "interview" 1 T F F T F F 2 1 2 4 700 F F F T T F F F F 5 2 40 F 1 1 2 1 1 1 1 "3" T 11 0 0 0 0 100 100

456 "UmfrageOSN" "interview" 1 T F F F T F 4 4 3 150 F T F T F T F F F 1 1 1 1 1 6 1 60 F "1" T 11 0 0 0 0 100 100

458 "UmfrageOSN" "interview" 2 T 3 0 0 0 0 100 100

459 "UmfrageOSN" "interview" 1 T F F T F F 2 2 2 5 650 F F T T F F F F F 3 1 21 F 1 1 1 1 1 1 1 "4" T 11 0 0 0 0 100 100

460 "UmfrageOSN" "interview" 1 T F F T F F 2 2 2 2 120 F F F F F F F F T 2 2 28 F 1 1 1 1 1 1 1 "3" T 11 0 0 0 0 100 100

465 "UmfrageOSN" "interview" 1 T F F F T F 2 1 1 150 F F F T T F F F F 1 1 1 1 1 1 2 3 1 26 F "1" T 11 0 0 0 0 100 100

466 "UmfrageOSN" "interview" 1 F T F F T F 1 2 2 150 F F F T F F F F F 5 2 45 F 2 2 2 1 2 2 2 "2" T 11 0 0 0 0 100 100

469 "UmfrageOSN" "interview" 1 T F F T F F 3 2 2 5 T T T T F T F F F 2 2 27 F 1 1 1 2 1 1 1 "2" T 11 0 0 0 0 100 100

473 "UmfrageOSN" "interview" 1 T F F F F T 1 1 1 4 350 F T F F F F F F F 3 2 22 F 1 1 1 1 1 1 1 "3" T 11 0 0 0 0 100 100

474 "UmfrageOSN" "interview" 1 T F F T F F 1 1 1 4 400 F T F T T F F F F 3 1 20 F 2 1 1 2 2 2 1 "4" T 11 0 0 0 0 100 100

476 "UmfrageOSN" "interview" 1 F T F F T F 1 1 1 1 9 F F F T F F F F F 1 1 1 1 1 1 2 1 53 F "1" T 11 0 0 0 0 100 100

479 "UmfrageOSN" "interview" 1 T F F F T F 1 1 1 4 50 F F T T T T F F F 5 2 36 F 1 1 2 2 2 1 1 "3" T 11 0 0 0 0 100 100

480 "UmfrageOSN" "interview" 1 T F F F T F 2 2 2 4 250 F T F T F F F F F 3 1 19 F 2 1 2 1 2 1 2 "4" T 11 0 0 0 0 100 100

483 "UmfrageOSN" "interview" 1 T F F F T F 1 5 250 F T T T F F F F F 2 2 2 2 2 2 2 3 2 24 F "1" T 11 0 0 0 0 100 100

487 "UmfrageOSN" "interview" 1 T F F F T F 1 2 1 4 250 F T F T F F F F F 3 1 18 F 1 1 1 1 1 1 "2" T 11 0 0 0 0 100 100

488 "UmfrageOSN" "interview" 1 T F F T F F 2 3 2 5 270 F T F T T T F F F 5 1 29 F 1 1 1 1 1 "3" T 11 0 0 0 0 100 100

489 "UmfrageOSN" "interview" 1 T F F F T F 3 2 2 5 500 F T T T F F F F F 5 2 26 F 2 2 2 2 2 1 "4" T 11 0 0 0 0 100 100

490 "UmfrageOSN" "interview" 1 T F F T F F 2 1 1 4 300 F T T T F F F F F 4 2 25 F 1 1 1 2 2 2 2 "2" T 11 0 0 0 0 100 100

491 "UmfrageOSN" "interview" 1 T F F F T F 3 2 2 5 400 F T F F F F F F F 4 2 27 F 1 1 1 1 1 1 1 "2" T 11 0 0 0 0 100 100

492 "UmfrageOSN" "interview" 1 T F F F F T 2 2 1 5 450 F F T T T T F F F 1 1 1 1 1 1 1 3 2 27 F "1" T 11 0 0 0 0 100 100

495 "UmfrageOSN" "interview" 1 T F F F F T 4 4 4 5 400 F T F F T F F F F 4 2 25 F 1 1 1 1 "3" T 11 0 0 0 0 100 100

496 "UmfrageOSN" "interview" 1 T F F F T F 2 1 2 5 600 F T F T F T F F F 5 2 32 F 2 1 1 2 2 1 1 "4" T 11 0 0 0 0 100 100

497 "UmfrageOSN" "interview" 1 T F F T F F 1 2 2 5 2000 F T F T T T F F F 3 2 35 F 1 1 2 1 2 1 1 "2" T 11 0 0 0 0 100 100

499 "UmfrageOSN" "interview" 1 T F F T F F 1 2 1 5 180 F T F T F T F T F 2 2 2 2 2 2 2 5 1 27 F "1" T 11 0 0 0 0 100 100

500 "UmfrageOSN" "interview" 1 T F F F T F 4 4 4 5 250 F T T T T F F F F 4 2 26 F 2 2 2 2 2 2 2 "3" T 11 0 0 0 0 100 100

501 "UmfrageOSN" "interview" 1 T F F F T F 3 2 2 5 400 F T T T T T F F F 4 1 23 F 2 2 2 2 2 2 "4" T 11 0 0 0 0 100 100

502 "UmfrageOSN" "interview" 1 T F F T F F 2 2 3 5 500 F T T T F T F F F 3 2 22 F 2 2 2 2 2 2 2 "3" T 11 0 0 0 0 100 100

504 "UmfrageOSN" "interview" 1 F F F F T F 2 2 3 5 1500 F F T T T T F F F 4 1 41 F 1 1 2 1 1 1 1 "2" T 11 0 0 0 0 100 100

507 "UmfrageOSN" "interview" 1 T F F T F F 2 2 1 5 350 F T F F T F F F F 4 1 22 F 2 1 2 1 1 "4" T 11 0 0 0 0 100 100

509 "UmfrageOSN" "interview" 1 T F F T F F 3 4 2 5 750 F T F T T T T F F 2 1 2 2 2 2 1 5 1 33 F "1" T 11 0 0 0 0 100 100

510 "UmfrageOSN" "interview" 1 T F F F T F 4 3 3 5 400 F T T T T T F F F 1 1 2 2 2 3 2 20 F "1" T 11 0 0 0 0 100 100

511 "UmfrageOSN" "interview" 1 T F F T F F 1 3 1 5 250 F T T T T T F F F 4 2 28 F 1 1 2 2 2 2 "3" T 11 0 0 0 0 100 100

512 "UmfrageOSN" "interview" 1 T F F F T F 3 4 2 5 80 F F T T F F F F F 3 2 20 F 2 2 2 2 1 1 1 "4" T 11 0 0 0 0 100 100

513 "UmfrageOSN" "interview" 1 T F F F T F 2 2 3 5 850 F T F T T F F F F 2 2 31 F 1 1 1 1 1 1 2 "2" T 11 0 0 0 0 100 100

517 "UmfrageOSN" "interview" 1 T F F T F F 3 2 2 4 666 F T F T T F F F F 1 1 2 1 2 1 1 4 1 26 F "1" T 11 0 0 0 0 100 100

519 "UmfrageOSN" "interview" 1 F F T T F F 3 2 3 5 130 F T F T F F F F F 3 2 22 F 2 1 1 2 1 1 1 "4" T 11 0 0 0 0 100 100

520 "UmfrageOSN" "interview" 1 T F F T F F 1 3 3 5 150 F T F F T F F F F 4 2 25 F 1 1 2 2 2 2 2 "2" T 11 0 0 0 0 100 100

522 "UmfrageOSN" "interview" 1 T F F F T F 1 2 2 5 300 F T F T T T F F F 2 1 2 2 2 2 1 3 1 19 F "1" T 11 0 0 0 0 100 100

523 "UmfrageOSN" "interview" 1 F T F F T F 3 2 3 5 500 F T T T T T T F F 6 1 14 F 2 2 2 2 2 2 1 "3" T 11 0 0 0 0 100 100

524 "UmfrageOSN" "interview" 1 T F F F T F 3 2 2 2 350 F F T T F T F F F 2 2 20 F 2 2 2 2 2 2 2 "2" T 11 0 0 0 0 100 100

526 "UmfrageOSN" "interview" 1 F T F F T F 3 3 2 5 300 F T T T T T T F F 6 1 16 F 2 2 2 2 2 2 1 "4" T 11 0 0 0 0 100 100

527 "UmfrageOSN" "interview" 1 T F F T F F 1 1 1 5 1500 F T T T T T F F F 5 1 25 F 1 1 1 1 1 1 1 "3" T 11 0 0 0 0 100 100

528 "UmfrageOSN" "interview" 1 T F F F T F 2 2 150 F T T F F F F T F 2 2 2 2 2 2 2 5 2 26 F "1" T 11 0 0 0 0 100 100


529 "UmfrageOSN" "interview" 1 T F F F T F 2 2 3 5 600 F T F F T F F F F 4 2 24 F 1 1 2 1 1 1 2 "2" T 11 0 0 0 0 100 100

530 "UmfrageOSN" "interview" 1 T F F T F F 2 2 2 5 1500 F T T T T T F F F 5 1 30 F 2 2 2 2 2 2 2 "4" T 11 0 0 0 0 100 100

531 "UmfrageOSN" "interview" 1 T F F F T F 2 2 2 5 150 F F F F F F F F T 4 2 26 F 1 1 1 1 1 1 1 "4" T 11 0 0 0 0 100 100

533 "UmfrageOSN" "interview" 1 T F F T F F 2 2 2 5 200 F T T T F T F F F 4 1 23 F 1 1 1 1 1 1 1 "3" T 11 0 0 0 0 100 100

536 "UmfrageOSN" "interview" 1 T F F T F F 2 4 2 5 600 F T T T F T F F F 4 1 25 F 1 1 1 1 1 2 "2" T 11 0 0 0 0 100 100

537 "UmfrageOSN" "interview" 1 T F F T F F 1 1 1 3 160 F F F F F F F F T 1 1 1 1 1 4 1 26 F "1" T 11 0 0 0 0 100 100

541 "UmfrageOSN" "interview" 1 F T F F T F 2 2 2 3 300 F T T T F T F F F 4 1 49 F 1 1 1 1 1 1 "2" T 11 0 0 0 0 100 100

543 "UmfrageOSN" "interview" 1 T F F T F F 3 3 3 5 300 F F T T T T F F F 1 1 1 1 2 2 2 47 F "1" T 11 0 0 0 0 100 100

544 "UmfrageOSN" "interview" 1 T F F T F F 2 2 2 3 100 F F F T T T F F F 6 3 50 F 2 2 2 2 2 2 1 "4" T 11 0 0 0 0 100 100

546 "UmfrageOSN" "interview" 1 F T F T F F 2 2 1 3 80 F T F F F T F F F 5 1 59 F 1 1 1 1 1 1 1 "3" T 11 0 0 0 0 100 100

547 "UmfrageOSN" "interview" 1 T F F F T F 2 2 1 5 600 F F F T T T F F F 2 2 2 2 2 2 1 5 1 38 F "1" T 11 0 0 0 0 100 100

548 "UmfrageOSN" "interview" 1 T F F T F F 2 2 2 4 440 F F F T T T F F F 3 1 48 F 1 1 2 2 2 2 2 "2" T 11 0 0 0 0 100 100

550 "UmfrageOSN" "interview" 1 T F F T F F 2 3 3 2 169 F F F T F F F T F 2 1 62 F 1 1 2 2 2 "4" T 11 0 0 0 0 100 100

554 "UmfrageOSN" "interview" 1 T F F F F T 2 2 1 5 483 F T T T F T F T F 3 1 23 F 1 1 1 1 2 "3" T 11 0 0 0 0 100 100

555 "UmfrageOSN" "interview" 1 T F F T F F 1 1 1 5 570 F T T T T T F F F 3 1 23 F 2 2 2 2 2 "2" T 11 0 0 0 0 100 100

556 "UmfrageOSN" "interview" 1 T F F T F F 4 3 2 3 66 F F F T T F F F F 3 1 69 F 1 1 1 1 1 1 1 "4" T 11 0 0 0 0 100 100

562 "UmfrageOSN" "interview" 1 T F F T F F 1 1 1 5 500 F T F T T F F F F 1 1 1 1 1 1 2 3 1 19 F "1" T 11 0 0 0 0 100 100

563 "UmfrageOSN" "interview" 1 T F F F T F 1 2 4 220 F F F T T T F F F 5 1 34 F 1 1 1 1 1 "3" T 11 0 0 0 0 100 100

564 "UmfrageOSN" "interview" 1 T F F T F F 1 1 1 5 200 F T F T F T T T F 2 1 32 F 1 1 1 1 1 1 "3" T 11 0 0 0 0 100 100

565 "UmfrageOSN" "interview" 1 T F F T F F 2 2 2 4 150 F F F T T T F F F 1 2 60 F 1 1 1 "2" T 11 0 0 0 0 100 100

566 "UmfrageOSN" "interview" 1 T F F F T F 2 3 2 5 400 F T T T F T T F F 3 2 23 F 2 2 1 2 2 2 "4" T 11 0 0 0 0 100 100

568 "UmfrageOSN" "interview" 1 T F F T F F 1 1 1 4 300 F T F T F F F F F 2 1 2 1 1 1 1 3 2 20 F "1" T 11 0 0 0 0 100 100

569 "UmfrageOSN" "interview" 1 T F F F T F 4 3 4 4 700 F T T T T T F F F 5 2 40 F 2 2 2 "3" T 11 0 0 0 0 100 100

570 "UmfrageOSN" "interview" 1 T F F T F F 1 2 2 4 120 F F F F F F F F T 1 1 1 1 1 1 1 3 2 50 F "1" T 11 0 0 0 0 100 100

572 "UmfrageOSN" "interview" 1 T F F T F F 2 3 2 4 T F F T T T F F F 6 3 T 2 "4" T 11 0 0 0 0 100 100

574 "UmfrageOSN" "interview" 1 T F F T F F 2 2 2 4 250 F F T F F F F F F 3 1 18 F 2 2 2 1 2 1 2 "2" T 11 0 0 0 0 100 100

575 "UmfrageOSN" "interview" 1 T F F T F F 2 2 2 3 600 F T T T T T F F F 3 2 25 F 1 2 2 2 "4" T 11 0 0 0 0 100 100

576 "UmfrageOSN" "interview" 2 T 3 0 0 0 0 100 100

579 "UmfrageOSN" "interview" 1 T F F F T F 2 3 2 4 300 F F F T T T T F F 3 1 51 F 1 1 1 1 1 1 1 "3" T 11 0 0 0 0 100 100

583 "UmfrageOSN" "interview" 1 T F F T F F 2 3 2 4 245 F T F T T T F F F 3 3 26 F 1 2 2 2 1 1 "2" T 11 0 0 0 0 100 100

585 "UmfrageOSN" "interview" 1 T F F F T F 2 2 1 5 250 F T T T F F F F F 2 2 1 2 1 1 1 4 2 23 F "1" T 11 0 0 0 0 100 100

588 "UmfrageOSN" "interview" 1 T F F F T F 2 2 2 5 150 F T T T T F F F F 4 2 23 F 1 1 2 2 1 1 1 "3" T 11 0 0 0 0 100 100

589 "UmfrageOSN" "interview" 1 T F F F T F 3 2 2 5 350 F T F T T F F F F 3 1 18 F 1 1 2 1 1 1 1 "2" T 11 0 0 0 0 100 100

595 "UmfrageOSN" "interview" 1 T F F F T F 1 1 1 3 180 F T F T T F F F F 3 2 23 F 1 1 1 1 1 1 "4" T 11 0 0 0 0 100 100

596 "UmfrageOSN" "interview" 1 T F F F T F 1 3 2 2 50 F T F T F F F F F 2 1 17 F 1 1 2 2 1 2 "4" T 11 0 0 0 0 100 100

599 "UmfrageOSN" "interview" 1 T F F T F F 2 2 3 5 350 F T F T T T F T F 3 1 31 F 1 1 2 2 2 2 "3" T 11 0 0 0 0 100 100

600 "UmfrageOSN" "interview" 1 T F F T F F 3 2 2 3 200 F F T T T T F T F 1 1 1 1 1 1 1 5 1 45 F "1" T 11 0 0 0 0 100 100

604 "UmfrageOSN" "interview" 1 T F F T F F 2 2 2 5 156 F T F T F F F F F 1 1 1 1 1 5 1 28 F "1" T 11 0 0 0 0 100 100

605 "UmfrageOSN" "interview" 1 T F F T F F 4 2 3 4 400 F T F T F T F F F 3 1 20 F 2 2 2 2 2 "4" T 11 0 0 0 0 100 100

608 "UmfrageOSN" "interview" 1 T F F T F F 2 2 2 5 1300 F T T T T T F T F 1 1 2 2 2 1 2 5 2 25 F "1" T 11 0 0 0 0 100 100

609 "UmfrageOSN" "interview" 1 T F F F T F 2 2 2 5 700 F T F T F T F F F 3 2 22 F 1 1 1 1 1 1 1 "2" T 11 0 0 0 0 100 100

610 "UmfrageOSN" "interview" 1 T F F T F F 2 2 2 4 134 F F F T F F F F F 3 2 27 F 2 1 2 2 2 2 1 "4" T 11 0 0 0 0 100 100

611 "UmfrageOSN" "interview" 1 F F F T F F 2 3 2 5 700 F F T T T F F F F 3 1 23 F 1 1 2 2 "3" T 11 0 0 0 0 100 100

612 "UmfrageOSN" "interview" 1 T F F F T F 2 4 2 2 467 F F F F T F F F F 1 1 1 1 1 1 1 5 2 29 F "1" T 11 0 0 0 0 100 100

614 "UmfrageOSN" "interview" 1 T F F T F F 1 1 1 4 398 F F F T T F F F F 5 1 26 F 1 1 2 2 1 1 1 "4" T 11 0 0 0 0 100 100

615 "UmfrageOSN" "interview" 1 T F F T F F 2 2 2 4 200 F T T T T T T F F 4 2 35 F 1 1 1 1 1 1 "2" T 11 0 0 0 0 100 100

616 "UmfrageOSN" "interview" 1 T F F T F F 2 3 1 5 450 F T T T T F F F F 3 2 20 F 1 1 2 2 2 2 "3" T 11 0 0 0 0 100 100

618 "UmfrageOSN" "interview" 1 F T F F F T 1 2 1 4 200 F T F F F F F F F 1 1 3 1 23 F "1" T 11 0 0 0 0 100 100

619 "UmfrageOSN" "interview" 1 T F F T F F 2 3 3 4 300 F T F T T T F T F 4 1 26 F 1 1 2 2 1 1 1 "3" T 11 0 0 0 0 100 100

624 "UmfrageOSN" "interview" 1 T F F T F F 1 3 1 5 2000 F T T T T T F F F 2 2 2 2 2 2 1 3 2 20 F "1" T 11 0 0 0 0 100 100

628 "UmfrageOSN" "interview" 1 T F F T F F 3 4 4 5 400 F F F T F T F F F 5 2 47 F 1 1 2 2 2 1 1 "4" T 11 0 0 0 0 100 100

632 "UmfrageOSN" "interview" 1 T F F T F F 4 3 3 4 350 F F F T F F F F F 3 2 25 F 1 1 1 1 1 1 1 "2" T 11 0 0 0 0 100 100

633 "UmfrageOSN" "interview" 1 T F F F F T 2 2 2 4 300 F F F T T T T T F 2 2 27 F 1 1 2 2 1 1 2 "3" T 11 0 0 0 0 100 100

634 "UmfrageOSN" "interview" 1 T F F T F F 3 3 3 4 217 F F F T F F F F F 5 2 30 F 1 1 1 2 1 1 1 "2" T 11 0 0 0 0 100 100

635 "UmfrageOSN" "interview" 1 F T F F T F 3 3 3 5 500 F T T T T T T F F 2 2 2 2 2 2 1 6 2 14 F "1" T 11 0 0 0 0 100 100

636 "UmfrageOSN" "interview" 1 T F F F T F 3 2 2 4 450 F T F T T T T F F 6 1 16 F 2 1 2 2 1 1 1 "4" T 11 0 0 0 0 100 100

637 "UmfrageOSN" "interview" 1 T F F T F F 2 2 1 2 175 F T F T T F F F F 2 1 2 1 1 1 1 3 1 20 F "1" T 11 0 0 0 0 100 100

645 "UmfrageOSN" "interview" 1 T F F F T F 3 4 3 5 700 F T T T T T T T F 6 1 15 F 2 1 2 2 2 "2" T 11 0 0 0 0 100 100

646 "UmfrageOSN" "interview" 1 T F F F F T 2 4 3 5 500 F T T T T T T T F 2 1 2 2 2 2 6 2 14 F "1" T 11 0 0 0 0 100 100

647 "UmfrageOSN" "interview" 1 T F F F T F 2 4 400 F T T T T T T T F 2 1 2 2 2 2 2 2 2 17 F "1" T 11 0 0 0 0 100 100

648 "UmfrageOSN" "interview" 1 T F F T F F 2 2 2 4 150 F T T T T T T F F 3 1 18 F 2 1 2 1 1 1 1 "4" T 11 0 0 0 0 100 100

649 "UmfrageOSN" "interview" 1 T F F F F T 2 3 3 5 600 F T T T T T T T T 1 1 17 F 2 1 2 2 2 2 2 "2" T 11 0 0 0 0 100 100

650 "UmfrageOSN" "interview" 1 T F F F T F 1 3 1 4 300 F T F T T T T F F 2 2 20 F 2 1 1 1 1 1 1 "3" T 11 0 0 0 0 100 100

651 "UmfrageOSN" "interview" 1 T F F F F T 3 3 3 4 600 F T T T T T T F F 3 1 19 F 2 1 2 1 2 2 1 "4" T 11 0 0 0 0 100 100

652 "UmfrageOSN" "interview" 1 T F F F T F 3 3 3 4 500 F T T T T T T F F 2 2 2 2 2 2 1 3 2 18 F "1" T 11 0 0 0 0 100 100

653 "UmfrageOSN" "interview" 1 T F F T F F 2 2 2 2 125 F T F T F F F F F 3 1 19 F 2 1 1 1 1 1 1 "2" T 11 0 0 0 0 100 100

654 "UmfrageOSN" "interview" 1 T F F F T F 1 3 1 4 340 F T F T T T T F F 3 1 18 F 2 1 2 2 1 1 1 "3" T 11 0 0 0 0 100 100

655 "UmfrageOSN" "interview" 1 T F F F T F 3 3 3 4 200 F T F T T T T F F 3 1 18 F 2 1 2 1 "4" T 11 0 0 0 0 100 100

656 "UmfrageOSN" "interview" 1 T F F F T F 3 3 3 5 360 F T F T T T T T F 3 1 17 F 2 2 1 "2" T 11 0 0 0 0 100 100

657 "UmfrageOSN" "interview" 1 T F F F T F 2 3 1 2 180 F T F T F T F F F 3 1 18 F 2 1 1 2 1 1 1 "3" T 11 0 0 0 0 100 100

659 "UmfrageOSN" "interview" 1 T F F T F F 2 3 3 5 140 F F F T T T F F F 5 1 28 F 1 1 2 2 2 2 "3" T 11 0 0 0 0 100 100

660 "UmfrageOSN" "interview" 1 F F T F T F 1 1 1 5 170 F T F T F F F F F 5 1 29 F 2 2 1 2 2 2 1 "4" T 11 0 0 0 0 100 100

663 "UmfrageOSN" "interview" 1 T F F T F F 3 1 1 3 750 F F F F F F F T F 3 2 27 F 1 1 2 2 2 2 "3" T 11 0 0 0 0 100 100

664 "UmfrageOSN" "interview" 1 T F F F F T 2 2 5 160 F T F F F F F F F 1 1 1 1 1 6 1 18 F "1" T 11 0 0 0 0 100 100

665 "UmfrageOSN" "interview" 1 T F F F T F 3 1 2 5 200 F T T T T F F F F 5 1 29 F 1 2 2 2 2 "4" T 11 0 0 0 0 100 100

666 "UmfrageOSN" "interview" 1 T F F F F T 2 3 2 4 200 F T F T T T T F F 3 1 18 F 2 1 2 2 1 "2" T 11 0 0 0 0 100 100

667 "UmfrageOSN" "interview" 1 T F F F T F 3 3 3 3 500 F T T T T T T F F 2 1 2 1 3 2 18 F "1" T 11 0 0 0 0 100 100

668 "UmfrageOSN" "interview" 1 T F F F T F 2 2 1 2 100 F T F T T T T F F 2 1 1 3 1 20 F "1" T 11 0 0 0 0 100 100

669 "UmfrageOSN" "interview" 1 T F F F T F 2 3 2 5 400 F T T T T T T T F 6 1 14 F 2 1 2 2 1 "3" T 11 0 0 0 0 100 100

670 "UmfrageOSN" "interview" 1 T F F T F F 2 2 2 3 400 F T T T T T T F F 3 1 18 F 2 1 2 1 1 "2" T 11 0 0 0 0 100 100

671 "UmfrageOSN" "interview" 1 T F F F T F 3 3 2 4 400 F T T T T T T T F 2 1 18 F 2 1 2 2 1 "4" T 11 0 0 0 0 100 100

672 "UmfrageOSN" "interview" 1 T F F F T F 3 2 1 3 370 F T F T T T F F F 3 1 17 F 2 1 2 1 1 1 "4" T 11 0 0 0 0 100 100

673 "UmfrageOSN" "interview" 1 T F F T F F 2 2 1 4 350 F T F T T T T F F 3 1 18 F 2 1 2 1 "3" T 11 0 0 0 0 100 100

674 "UmfrageOSN" "interview" 1 T F F F F T 3 2 2 4 460 F T F T T T T F F 2 1 2 2 2 2 1 2 16 F "1" T 11 0 0 0 0 100 100

675 "UmfrageOSN" "interview" 1 T F F F T F 3 3 5 600 F T T T T T T T F 2 2 16 F 2 2 2 2 "2" T 11 0 0 0 0 100 100

676 "UmfrageOSN" "interview" 1 T F F F T F 3 3 2 4 500 F T T T T T T T F 2 2 17 F 2 1 2 2 2 2 "4" T 11 0 0 0 0 100 100

677 "UmfrageOSN" "interview" 1 T F F F T F 3 3 3 5 400 F T T T T T T T F 1 1 16 F 2 2 2 2 2 1 "3" T 11 0 0 0 0 100 100

679 "UmfrageOSN" "interview" 1 T F F T F F 2 2 2 5 140 F T F T T T F F F 4 2 28 F 2 2 2 2 "2" T 11 0 0 0 0 100 100

680 "UmfrageOSN" "interview" 1 T F F T F F 2 1 2 4 200 F F T T F F F F F 5 2 28 F 1 1 1 2 1 1 1 "4" T 11 0 0 0 0 100 100


681 "UmfrageOSN" "interview" 1 T F F T F F 2 2 2 2 194 F T F T F F F F F 1 1 1 1 1 1 1 5 2 39 F "1" T 11 0 0 0 0 100 100

682 "UmfrageOSN" "interview" 1 T F F F T F 4 4 1 5 1200 F T F T T T F F F 5 2 29 F 1 1 2 2 2 1 1 "3" T 11 0 0 0 0 100 100

683 "UmfrageOSN" "interview" 1 T F F F T F 1 2 2 5 350 F T F T T T F F F 3 1 23 F 1 2 2 2 2 1 "3" T 11 0 0 0 0 100 100

685 "UmfrageOSN" "interview" 1 T F F T F F 2 2 2 4 80 F F T T F T F F F 1 1 2 2 1 5 1 29 F "1" T 11 0 0 0 0 100 100

687 "UmfrageOSN" "interview" 1 T F F F T F 2 2 2 5 470 F T F T F F F F F 4 1 22 F 1 1 1 2 1 1 "4" T 11 0 0 0 0 100 100

690 "UmfrageOSN" "interview" 1 T F F T F F 2 2 2 4 450 F T F T T T F F F 5 2 42 F 1 1 2 2 2 2 2 "2" T 11 0 0 0 0 100 100

694 "UmfrageOSN" "interview" 1 T F F T F F 2 2 2 5 600 F T T T T T F F F 3 2 24 F 1 1 1 2 2 1 "3" T 11 0 0 0 0 100 100

695 "UmfrageOSN" "interview" 1 T F F T F F 2 1 1 2 120 F T F F F F F F F 1 1 1 1 1 1 2 4 2 27 F "1" T 11 0 0 0 0 100 100

696 "UmfrageOSN" "interview" 1 T F F T F F 2 3 2 5 215 F T F T T T F F F 3 1 22 F 1 1 1 1 1 1 1 "4" T 11 0 0 0 0 100 100

697 "UmfrageOSN" "interview" 1 T F F T F F 2 2 2 4 600 F T F T F F F F F 4 2 23 F 1 1 2 1 1 1 1 "3" T 11 0 0 0 0 100 100

699 "UmfrageOSN" "interview" 1 T F F T F F 2 1 2 5 150 F T F T T F F F F 3 2 20 F 1 1 1 1 1 1 1 "4" T 11 0 0 0 0 100 100

710 "UmfrageOSN" "interview" 1 T F F F T F 1 2 370 F F F T F F F F F 3 1 T 1 1 2 2 2 1 1 "2" T 11 0 0 0 0 100 100

711 "UmfrageOSN" "interview" 1 T F F T F F 2 3 4 5 200 F F T T F T F F F 1 1 1 1 1 1 2 5 1 26 F "1" T 11 0 0 0 0 100 100

723 "UmfrageOSN" "interview" 1 T F F T F F 1 1 1 2 150 F F F F F F F F T 1 1 1 1 1 1 1 3 1 23 F "1" T 11 0 0 0 0 100 100

737 "UmfrageOSN" "interview" 1 T F F T F F 1 1 1 2 300 F F F T F T F F F 5 2 30 F 1 1 1 2 1 1 1 "4" T 11 0 0 0 0 100 100

743 "UmfrageOSN" "interview" 1 T F F T F F 3 3 2 5 280 F T F T F T F F F 5 2 47 F 1 1 1 1 1 "2" T 11 0 0 0 0 100 100

Table 21. Survey Raw Data.
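The records above are whitespace-delimited rows with a variable number of fields per respondent (dropouts produce short rows), `T`/`F` logicals, and quoted strings. A minimal sketch of loading such an export in R — assuming the rows were saved to a hypothetical file `rawdata.txt`; the file name and column positions are illustrative, not from the original script:

```r
# Read the whitespace-delimited survey export; fill = TRUE pads
# incomplete (dropout) rows with NA instead of raising an error.
raw <- read.table("rawdata.txt",
                  header = FALSE,           # the export carries no header row
                  fill = TRUE,              # rows have varying field counts
                  stringsAsFactors = FALSE)

# "T"/"F" columns can then be coerced to logicals explicitly, e.g.:
# raw$V5 <- as.logical(raw$V5)
```

Note that `read.table` infers the column count from the first rows, so with very ragged data the widest row may need to be checked against `ncol(raw)`.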

R Script Data of Subchapter 3.1:

# R script for survey:

# Private Data as Payment Method in Social Network Services?

# by Claus-Georg Nolte

setwd("…")

library(psych)

library(car)

# response rate

(386/640)*100

(320/640)*100

#general

#run r-data import script

#basics

table(data$A101)

table(data$D101)

table(data$D102)

# corrections

data$age<-data$D103_01

data$fb.friends<-data$A302_01

mean(data$age,na.rm=T)

max(data$age,na.rm=T)

min(data$age,na.rm=T)

mean(data$fb.friends,na.rm=T)

# organize the data

data$pr.opt<-NA # new variable for changed privacy options

data$pr.opt<-ifelse(data$A201_01=="TRUE",1,ifelse(data$A201_02=="TRUE",0,NA))

data$pr.rd<-NA # new variable for having read the data use policy

data$pr.rd<-ifelse(data$A203_01=="TRUE",1,ifelse(data$A203_02=="TRUE",0,NA))

# gender as numbers (1 = male):

data$gender<-as.numeric(recode(data$D102, '"männlich"=1;"weiblich"=2;"keine Angabe"=NA;'))

data$gen.male<-as.numeric(recode(data$gender, '1=1;2=0;'))

data$gen.female<-as.numeric(recode(data$gender, '1=0;2=1;'))

# educational attainment as numbers:

data$educ<-as.numeric(recode(data$D101, '

"Haupt-(Volks-)schulabschluss"=1;

"Realschul- oder gleichwertiger Abschluss"=2;

"Fachhochschul- oder Hochschulreife"=3;

"Bachelor Abschluss"=4;

"Master oder gleichwertiger bzw. höherer Abschluss"=5;

"keine Angabe"=NA;' ))

# the higher the trust in FB, the lower the variable value

data$fb.trust1<-recode(data$A202_01, '1=5;2=4;3=3;4=2;5=1;-1="NA";-9="NA";')

data$fb.trust2<-recode(data$A202_03, '1=5;2=4;3=3;4=2;5=1;-1="NA";-9="NA";')

data$fb.trust3<-recode(data$A202_04, '1=5;2=4;3=3;4=2;5=1;-1="NA";-9="NA";')

# new variable for privacy awareness:

data$fb.trust1.n<-data$fb.trust1/5

data$fb.trust2.n<-data$fb.trust2/5

data$fb.trust3.n<-data$fb.trust3/5

attach(data)

data$pr.awa<-rowMeans(data.frame(pr.opt,pr.rd,fb.trust1.n,fb.trust2.n,fb.trust3.n),na.rm=T)

detach(data)

mean(data$pr.awa,na.rm=T)

# standardize:

data$pr.awa.z<-scale(data$pr.awa)

# new variable for FB usage behavior:

data$fb.use<-recode(data$A301_01, '1=1;2=2;3=3;4=4;5=5;-1="NA";-9="NA";')


# new variables for FB feature usage (yes = 1, no = 0):

data$func.chat<-recode(data$B101_01,'"TRUE"=1;"FALSE"=0;')

data$func.messenger<-recode(data$B101_02,'"TRUE"=1;"FALSE"=0;')

data$func.news<-recode(data$B101_03, '"TRUE"=1;"FALSE"=0;')

data$func.events<-recode(data$B101_07, '"TRUE"=1;"FALSE"=0;')

data$func.pics<-recode(data$B101_04, '"TRUE"=1;"FALSE"=0;')

data$func.vids<-recode(data$B101_05, '"TRUE"=1;"FALSE"=0;')

data$func.apps<-recode(data$B101_06, '"TRUE"=1;"FALSE"=0;')

attach(data)

data$func.all<-rowSums(data.frame(func.chat,func.messenger,func.news,func.events,func.pics,func.vids,func.apps),na.rm=T)

detach(data)

# new variable for experimental group membership:

# group1=MND-x, group2=MND, group3=MND+x, group4=MND+x+y

data$exp.gr<-recode(data$G101_01, '1=2;2=3;3=4;4=1;')

# new variables for group2=MND (yes = 1, no = 0):

data$gr2.func.chat<-recode(data$C101_01,'1=0;2=1;-1="NA";-9="NA";')

data$gr2.func.messenger<-recode(data$C102_01,'1=0;2=1;-1="NA";-9="NA";')

data$gr2.func.news<-recode(data$C104_01,'1=0;2=1;-1="NA";-9="NA";')

data$gr2.func.events<-recode(data$C103_01,'1=0;2=1;-1="NA";-9="NA";')

data$gr2.func.pics<-recode(data$C105_01,'1=0;2=1;-1="NA";-9="NA";')

data$gr2.func.vids<-recode(data$C106_01,'1=0;2=1;-1="NA";-9="NA";')

data$gr2.func.apps<-recode(data$C107_01,'1=0;2=1;-1="NA";-9="NA";')

attach(data)

data$gr2.func.all<-rowSums(data.frame(gr2.func.chat,gr2.func.messenger,gr2.func.news,gr2.func.events,gr2.func.pics,gr2.func.vids,gr2.func.apps),na.rm=T)

detach(data)

# new variables for group3=MND+x (yes = 1, no = 0):

data$gr3.func.chat<-recode(data$C201_01,'1=0;2=1;-1="NA";-9="NA";')

data$gr3.func.messenger<-recode(data$C202_01,'1=0;2=1;-1="NA";-9="NA";')

data$gr3.func.news<-recode(data$C204_01,'1=0;2=1;-1="NA";-9="NA";')

data$gr3.func.events<-recode(data$C203_01,'1=0;2=1;-1="NA";-9="NA";')

data$gr3.func.pics<-recode(data$C205_01,'1=0;2=1;-1="NA";-9="NA";')

data$gr3.func.vids<-recode(data$C206_01,'1=0;2=1;-1="NA";-9="NA";')

data$gr3.func.apps<-recode(data$C207_01,'1=0;2=1;-1="NA";-9="NA";')

attach(data)

data$gr3.func.all<-rowSums(data.frame(gr3.func.chat,gr3.func.messenger,gr3.func.news,gr3.func.events,gr3.func.pics,gr3.func.vids,gr3.func.apps),na.rm=T)

detach(data)

# new variables for group4=MND+x+y (yes = 1, no = 0):

data$gr4.func.chat<-recode(data$C301_01,'1=0;2=1;-1="NA";-9="NA";')

data$gr4.func.messenger<-recode(data$C302_01,'1=0;2=1;-1="NA";-9="NA";')

data$gr4.func.news<-recode(data$C304_01,'1=0;2=1;-1="NA";-9="NA";')

data$gr4.func.events<-recode(data$C303_01,'1=0;2=1;-1="NA";-9="NA";')

data$gr4.func.pics<-recode(data$C305_01,'1=0;2=1;-1="NA";-9="NA";')

data$gr4.func.vids<-recode(data$C306_01,'1=0;2=1;-1="NA";-9="NA";')

data$gr4.func.apps<-recode(data$C307_01,'1=0;2=1;-1="NA";-9="NA";')

attach(data)

data$gr4.func.all<-rowSums(data.frame(gr4.func.chat,gr4.func.messenger,gr4.func.news,gr4.func.events,gr4.func.pics,gr4.func.vids,gr4.func.apps),na.rm=T)

detach(data)

# new variables for group1=MND-x (yes = 1, no = 0):

data$gr1.func.chat<-recode(data$C001_01,'1=0;2=1;-1=NA;-9=NA')

data$gr1.func.messenger<-recode(data$C002_01,'1=0;2=1;-1=NA;-9=NA')

data$gr1.func.news<-recode(data$C004_01,'1=0;2=1;-1=NA;-9=NA')

data$gr1.func.events<-recode(data$C003_01,'1=0;2=1;-1=NA;-9=NA')

data$gr1.func.pics<-recode(data$C005_01,'1=0;2=1;-1=NA;-9=NA')

data$gr1.func.vids<-recode(data$C006_01,'1=0;2=1;-1=NA;-9=NA')

data$gr1.func.apps<-recode(data$C007_01,'1=0;2=1;-1=NA;-9=NA')

attach(data)

data$gr1.func.all<-rowSums(data.frame(gr1.func.chat,gr1.func.messenger,gr1.func.news,gr1.func.events,gr1.func.pics,gr1.func.vids,gr1.func.apps),na.rm=T)

detach(data)

# Number of newly chosen functions regardless of group membership:

attach(data)

data$gr.func.all<-rowSums(data.frame(gr1.func.all,gr2.func.all,gr3.func.all,gr4.func.all),na.rm=T)

detach(data)

which(data$gr.func.all==8) # should not occur

# Difference between newly and previously chosen functions:

attach(data)

data$func.diff<-gr.func.all-func.all # NAs are kept so the column length matches the data frame

detach(data)

#first insights

mean(data$func.diff,na.rm=T)

mean(data$func.diff[which(data$exp.gr==1)],na.rm=T)

mean(data$func.diff[which(data$exp.gr==2)],na.rm=T)

mean(data$func.diff[which(data$exp.gr==3)],na.rm=T)

mean(data$func.diff[which(data$exp.gr==4)],na.rm=T)

mean(data$gr.func.all)

mean(data$gr.func.all[which(data$exp.gr==1)])

mean(data$gr.func.all[which(data$exp.gr==2)])

mean(data$gr.func.all[which(data$exp.gr==3)])

mean(data$gr.func.all[which(data$exp.gr==4)])

# Descriptive statistics

attach(data)

cor.test(pr.awa,fb.use)

cor.test(pr.awa,func.all)

cor.test(pr.awa,fb.friends)

cor.test(pr.awa,educ)

cor.test(age,pr.awa)

cor.test(age,fb.friends)

cor.test(age,func.all)

cor.test(gen.male,pr.awa)

cor.test(gen.male,fb.use)


cor.test(gen.female,fb.friends)

auswahl<-data.frame(exp.gr,age,gen.male,fb.friends,fb.use,pr.awa,educ,func.all,gr.func.all,func.diff)

d<-describe(auswahl)

cors<-corr.test(auswahl)$r

tab<-cbind(M=d$mean,SD=d$sd,cors)

write.csv2(tab,file="Deskriptive Stata OSN.csv")

detach(data)

# Plots

attach(data)

hist(age,breaks=80)

hist(pr.awa,breaks=50)

hist(fb.friends,breaks=50)

boxplot(age ~ exp.gr)

boxplot(pr.awa ~ exp.gr)

boxplot(fb.friends ~ exp.gr)

boxplot(fb.use ~ exp.gr)

boxplot(gen.female ~ exp.gr)

boxplot(gr.func.all ~ exp.gr)

boxplot(func.diff ~ exp.gr)

detach(data)

# Homogeneity of variance

attach(data)

leveneTest(age, exp.gr)

leveneTest(pr.awa, exp.gr)

leveneTest(fb.friends, exp.gr)

leveneTest(fb.use, exp.gr)

leveneTest(gen.male, exp.gr)

leveneTest(func.diff, exp.gr)

leveneTest(educ, exp.gr)

#T-Test

t.test(fb.friends ~ gen.male)

detach(data)

#Regression

attach(data)

plot(func.diff ~ exp.gr)

abline(lm(func.diff ~ exp.gr))

summary(lm(func.diff ~ exp.gr))

plot(func.diff ~ pr.awa)

abline(lm(func.diff ~ pr.awa))

summary(lm(func.diff ~ pr.awa))

plot(func.diff ~ gen.male)

abline(lm(func.diff ~ gen.male))

summary(lm(func.diff ~ gen.male))

summary(lm(func.diff ~ educ))

summary(lm(func.diff ~ exp.gr + pr.awa + gen.male + educ))

plot(gr.func.all ~ exp.gr)

abline(lm(gr.func.all ~ exp.gr))

summary(lm(gr.func.all ~ exp.gr))

plot(gr.func.all ~ pr.awa)

abline(lm(gr.func.all ~ pr.awa))

summary(lm(gr.func.all ~ pr.awa))

plot(gr.func.all ~ gen.male)

abline(lm(gr.func.all ~ gen.male))

summary(lm(gr.func.all ~ gen.male))

summary(lm(gr.func.all ~ fb.use))

summary(lm(gr.func.all ~ fb.friends))

summary(lm(gr.func.all ~ age))

summary(lm(gr.func.all ~ educ))

summary(lm(gr.func.all ~ exp.gr + pr.awa + gen.male + age + fb.use + fb.friends))

summary(lm(func.all ~ gen.male))

summary(lm(func.all ~ educ))

summary(lm(func.all ~ pr.awa + age + fb.use + fb.friends + educ))

detach(data)

Table 22. R Script Data.


Table 23. Mean, Standard Deviation, and Correlations.

Figure 20. Histogram of Age.


Figure 21. Histogram of FB Friends.

Figure 22. Histogram of Privacy Awareness.


Figure 23. Distribution of Age across Experiment Groups.

Figure 24. Distribution of FB Friends across Experiment Groups.


Figure 25. Distribution of Used FB Functions across Experiment Groups.

Figure 26. Distribution of Users' Privacy Awareness across Experiment Groups.


Figure 27. Distribution of Degree of FB Use across Experiment Groups.

Figure 28. Distribution of Chosen Functions across Experiment Groups.


age over exp.gr:

        DF  F value  Pr(>F)
group    3  0.20     0.89

Table 24. Levene’s Test for Homogeneity of Variance (center = median) for the variable age over exp.gr.

pr.awa over exp.gr:

        DF  F value  Pr(>F)
group    3  0.49     0.69

Table 25. Levene’s Test for Homogeneity of Variance (center = median) for the variable pr.awa over exp.gr.

fb.friends over exp.gr:

        DF  F value  Pr(>F)
group    3  0.23     0.87

Table 26. Levene’s Test for Homogeneity of Variance (center = median) for the variable fb.friends over exp.gr.

fb.use over exp.gr:

        DF  F value  Pr(>F)
group    3  0.53     0.66

Table 27. Levene’s Test for Homogeneity of Variance (center = median) for the variable fb.use over exp.gr.

func.diff over exp.gr:

        DF  F value  Pr(>F)
group    3  0.46     0.71

Table 28. Levene’s Test for Homogeneity of Variance (center = median) for the variable func.diff over exp.gr.

educ over exp.gr:

        DF  F value  Pr(>F)
group    3  0.71     0.55

Table 29. Levene’s Test for Homogeneity of Variance (center = median) for the variable educ over exp.gr.


func.diff_i = β_0 + β_1 · exp.gr_i

Equation 8. Bivariate Regression for func.diff_i with exp.gr_i as explanatory Variable.

Coefficients  Estimate  Std. Error  t value  Pr(>|t|)
(Intercept)   -0.45     0.31        -1.41    0.16
exp.gr        -0.24     0.12        -2.08    0.04 *

Signif. codes: 0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1

Residual standard error: 2.24 on 307 degrees of freedom (11 observations deleted due to missingness)
Multiple R-squared: 0.01, Adjusted R-squared: 0.01

Table 30. Bivariate Regression for func.diff_i with exp.gr_i as explanatory Variable.

Figure 29. Bivariate Regression for func.diff_i with exp.gr_i as explanatory Variable.
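As a quick plausibility check on the reported output (not part of the original script), each t value in Table 30 should equal the coefficient estimate divided by its standard error. With the rounded values displayed for exp.gr:

```latex
t_{\hat\beta_1} \;=\; \frac{\hat\beta_1}{\operatorname{SE}(\hat\beta_1)} \;=\; \frac{-0.24}{0.12} \;=\; -2.0 \;\approx\; -2.08
```

The small discrepancy arises because the estimate and standard error are rounded to two decimals in the table, while R computes the statistic from the unrounded values.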


func.diff_i = β_0 + β_1 · pr.awa_i

Equation 9. Bivariate Regression for func.diff_i with pr.awa_i as explanatory Variable.

Coefficients  Estimate  Std. Error  t value  Pr(>|t|)
(Intercept)    0.60     0.66         0.91    0.36
pr.awa        -2.13     0.85        -2.52    0.01 *

Signif. codes: 0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1

Residual standard error: 2.24 on 307 degrees of freedom (11 observations deleted due to missingness)
Multiple R-squared: 0.02, Adjusted R-squared: 0.02

Table 31. Bivariate Regression for func.diff_i with pr.awa_i as explanatory Variable.

func.diff_i = β_0 + β_1 · gen.male_i

Equation 10. Bivariate Regression for func.diff_i with gen.male_i as explanatory Variable.

Coefficients  Estimate  Std. Error  t value  Pr(>|t|)
(Intercept)   -1.38     0.19        -7.21    4.61e-12 ***
gen.male       0.59     0.26         2.29    0.02 *

Signif. codes: 0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1

Residual standard error: 2.24 on 299 degrees of freedom (19 observations deleted due to missingness)
Multiple R-squared: 0.02, Adjusted R-squared: 0.01

Table 32. Bivariate Regression for func.diff_i with gen.male_i as explanatory Variable.

func.diff_i = β_0 + β_1 · educ_i

Equation 11. Bivariate Regression for func.diff_i with educ_i as explanatory Variable.

Coefficients  Estimate  Std. Error  t value  Pr(>|t|)
(Intercept)   -2.29     0.47        -4.90    1.6e-06 ***
educ           0.35     0.13         2.74    0.007 **

Signif. codes: 0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1

Residual standard error: 2.24 on 291 degrees of freedom (27 observations deleted due to missingness)
Multiple R-squared: 0.03, Adjusted R-squared: 0.02

Table 33. Bivariate Regression for func.diff_i with educ_i as explanatory Variable.


gr.func.all_i = β_0 + β_1 · educ_i

Equation 12. Bivariate Regression for gr.func.all_i with educ_i as explanatory Variable.

Coefficients  Estimate  Std. Error  t value  Pr(>|t|)
(Intercept)    2.15     0.45         4.81    2.4e-06 ***
educ           0.02     0.12         0.14    0.89

Signif. codes: 0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1

Residual standard error: 2.134 on 291 degrees of freedom (27 observations deleted due to missingness)
Multiple R-squared: 6.614e-05, Adjusted R-squared: -0.003

Table 34. Bivariate Regression for gr.func.all_i with educ_i as explanatory Variable.

func.diff_i = β_0 + β_1 · exp.gr_i + β_2 · pr.awa_i + β_3 · gen.male_i + β_4 · educ_i

Equation 13. Multiple Linear Regression for func.diff_i.

Coefficients  Estimate  Std. Error  t value  Pr(>|t|)
(Intercept)   -0.21     0.89        -0.23    0.82
exp.gr        -0.30     0.12        -2.57    0.01 *
pr.awa        -2.20     0.92        -2.38    0.01 *
gen.male       0.49     0.26         1.87    0.06 .
educ           0.37     0.13         2.92    0.004 **

Signif. codes: 0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1

Residual standard error: 2.24 on 291 degrees of freedom (27 observations deleted due to missingness)
Multiple R-squared: 0.03, Adjusted R-squared: 0.02

Table 35. Multiple Linear Regression for func.diff_i.


References

1. Warren, S.D., Brandeis, L.D.: The Right to Privacy. Harvard Law Review, 193–220 (1890)

2. Westin, A.F.: Privacy and Freedom. Washington and Lee Law Review 25(1), 166–170 (1968)

3. Jones, S.: Encyclopedia of New Media. An essential Reference to Communication and Technology. Sage

Publications (2002)

4. Ellison, N.B.: Social Network Sites. Definition, History, and Scholarship. Journal of Computer‐Mediated

Communication 13(1), 210–230 (2007)

5. Westin, A.F.: Social and Political Dimensions of Privacy. Journal of Social Issues 59(2), 431–453 (2003)

6. Orwell, G.: Nineteen Eighty-Four. Secker & Warburg, London, GB (1949)

7. Hammelehle, S.: Prism-Skandal: Orwells "1984" wird in den USA und Großbritannien wieder zum

Bestseller. http://spon.de/adXIS (2013). Accessed 16 November 2017

8. Bollier, D., Firestone, C.M.: The Promise and Peril of Big Data. Aspen Institute, Communications and

Society Program Washington, DC (2010)

9. Smith, H.J., Dinev, T., Xu, H.: Information Privacy Research. An Interdisciplinary Review. MIS Quarterly

35(4), 989–1016 (2011)

10. Lyon, D.: Surveillance, Snowden, and Big Data. Capacities, Consequences, Critique. Big Data & Society

1(2) (2014)

11. Seltzer, M.: The Surveillance State of Today. World Economic Forum Annual Meeting, Davos, Switzerland, 22

January 2015. https://youtu.be/l51S4dL6BHE. Accessed 3 May 2016

12. Doward, J., Gibbs, A.: Did Cambridge Analytica influence the Brexit vote and the US election?

https://goo.gl/ZKsZJJ (2017). Accessed 16 November 2017

13. Jones, J.J., Bond, R.M., Bakshy, E., Eckles, D., Fowler, J.H.: Social Influence and political Mobilization.

Further Evidence from a randomized Experiment in the 2012 U.S. Presidential Election. PloS one (2017).

doi: 10.1371/journal.pone.0173851

14. O'Neil, C.: Weapons of Math Destruction. How Big Data increases Inequality and Threatens Democracy.

Broadway Books (2017)

15. United Nations: Universal Declaration of Human Rights. UN (1948)

16. Dinev, T.: Why would we care about Privacy? EJIS 23(2), 97–102 (2014)

17. Zhang, C., Sun, J., Zhu, X., Fang, Y.: Privacy and Security for Online Social Networks. Challenges and

Opportunities. IEEE Network 24(4), 13–18 (2010)

18. Lanier, J.: Who owns the Future? Simon and Schuster (2014)

19. Zuboff, S.: Lasst euch nicht enteignen! Unsere Zukunft mit "Big Data". http://www.faz.net/-gsf-7twrt

(2014). Accessed 23 August 2017

20. Mischel, W.: Personality and Assessment. Psychology Press (1968)

21. Pariser, E.: The Filter Bubble. What the Internet is hiding from you. Penguin UK (2011)

22. Marichal, J.: Facebook Democracy. The Architecture of Disclosure and the Threat to Public Life.

Routledge (2016)

23. ITU: Number of Internet Users worldwide from 2005 to 2016. https://goo.gl/SF9uVN (2016). Accessed 7

December 2017

24. Facebook: Facebook Q4 2016 Results. Facebook. https://goo.gl/QwE5c3 (2016). Accessed 7 December

2017

25. Posner, R.A.: Privacy, Surveillance, and Law. The University of Chicago Law Review 75(1), 245–260

(2008)

26. Fuchs, C., Boersma, K., Albrechtslund, A., Sandoval, M.: Internet and Surveillance. The Challenges of

Web 2.0 and Social Media. Routledge (2013)

27. Nolte, C.-G., Zimmermann, C., Müller, G.: Social Network Services' Market Structure and its Influence on

Privacy. In: Amsterdam Privacy Conference, pp. 269–271 (2015)

28. Nolte, C.-G.: Personal Data as Payment Method in SNS and Users’ concerning Price Sensitivity - A Survey.

In: International Conference on Business Information Systems, pp. 273–282. Springer (2015)

29. Zimmermann, C., Nolte, C.-G.: Towards Balancing Privacy and Efficiency. A Principal-Agent Model of

Data-Centric Business. In: International Workshop on Security and Trust Management, pp. 89–104.

Springer (2015)

30. Nolte, C.-G., Brenig, C., Müller, G.: Coherences on Privacy in Social Network Services. A Qualitative

System Dynamics Analysis. In: IFIP Summer School, Karlstad, 21-26.08.2016 (2016)

31. Nolte, C.-G., Schwarz, J., Zimmermann, C.: Social Network Services: Competition and Privacy. In:

Internationale Tagung Wirtschaftsinformatik, St. Gallen (2017)

32. Nolte, C.-G., Rosenberg, B.: The General Data Protection Regulation’s Impact on Privacy. An Assessment

for Social Networks. Submitted for Review: Business & Information Systems Engineering (2018)


33. Nolte, C.-G.: Options to enhance Privacy Competition in the Social Network Market. Submitted for

Review: Journal of the Association for Information Systems (2018)

34. European Union: Regulation (EU) 2016/679 & Directive (EU) 2016/680 (2016)

35. McAfee, A., Brynjolfsson, E., Davenport, T.H.: Big Data. The Management Revolution. Harvard Business

Review 90(10), 60–68 (2012)

36. World Economic Forum: Personal Data: The Emergence of a New Asset Class. https://goo.gl/EGWqj5

(2011). Accessed 6 December 2017

37. Schermer, B.W.: The Limits of Privacy in Automated Profiling and Data Mining. Computer Law &

Security Review 27(1), 45–52 (2011)

38. Tene, O., Polonetsky, J.: Privacy in the Age of Big Data. A Time for Big Decisions. Stanford Law Review

Online (2012)

39. Beuscart, J.-S., Mellet, K.: Business Models of the Web 2.0. Advertising or the Tale of Two Stories.

Communications & Strategies, Special Issue (2009)

40. Google: The next Chapter for Flu Trends. https://goo.gl/8vNTN7 (2015). Accessed 16 November 2017

41. Enders, A., Hungenberg, H., Denker, H.-P., Mauch, S.: The long Tail of Social Networking. Revenue

Models of Social Networking Sites. European Management Journal 26(3), 199–211 (2008)

42. Müller, G., Flender, C., Peters, M.: Vertrauensinfrastruktur und Privatheit als ökonomische Fragestellung.

In: Internet Privacy, pp. 143–188. Springer (2012)

43. Evans, D.S.: The Economics of the Online Advertising Industry. Review of network economics (2008).

doi: 10.2202/1446-9022.1154

44. Staykova, K.S., Damsgaard, J.: A Typology of Multi-Sided Platforms: The Core and the Periphery. In:

European Conference on Information Systems (2015)

45. Shapiro, C., Varian, H.R.: Information Rules. A Strategic Guide to the Network Economy. Harvard

Business Press (1999)

46. Kane, G.C., Alavi, M., Labianca, G.J., Borgatti, S.: What’s different about Social Media Networks? A

Framework and Research Agenda. MIS Quarterly, forthcoming (2012)

47. Buchmann, J. (ed.): Internet Privacy. Options for adequate Realisation. acatech Study Series. Springer

(2013)

48. Buchmann, J. (ed.): Internet Privacy. A multidisciplinary Analysis. acatech Study Series. Springer (2013)

49. Kox, H.L.M., Straathof, B., Zwart, G.: Targeted Advertising, Platform Competition and Privacy (2014)

50. Hagiu, A., Wright, J.: Multi-Sided Platforms. International Journal of Industrial Organization 43, 162–174

(2015)

51. Hallinan, D., Friedewald, M., McCarthy, P.: Citizens' Perceptions of Data Protection and Privacy in

Europe. Computer Law & Security Review 28(3), 263–272 (2012)

52. Purtova, N.: Property Rights in Personal Data. A European Perspective. Kluwer Law International (2012)

53. Carew, P., Stapleton, L.: Towards a Privacy Framework for Information Systems Development. In:

Vaselicas, O., Wojtowski, W., Wojtowski, G. (eds.) Information Systems Development: Advances in

Theory, Practice and Education, pp. 77–88. Kluwer Academic Press (1999)

54. Burgoon, J.K.: Privacy and Communication. Annals of the International Communication Association 6(1),

206–249 (1982)

55. Pedersen, D.M.: Model for Types of Privacy by Privacy Functions. Journal of Environmental Psychology

19(4), 397–405 (1999)

56. Altman, I.: Privacy - A Conceptual Analysis. Environment and Behavior 8(1), 7–29 (1976)

57. Newell, P.B.: A cross-cultural Comparison of Privacy Definitions and Functions. A Systems Approach.

Journal of Environmental Psychology 18(4), 357–371 (1998)

58. Clarke, R.: Introduction to Dataveillance and Information Privacy, and Definitions of Terms.

https://goo.gl/z3vd2Z (1999). Accessed 6 December 2017

59. Solove, D.J.: Understanding Privacy. Harvard University Press, Cambridge, Mass (2008)

60. Finn, R.L., Wright, D., Friedewald, M.: Seven Types of Privacy. In: European Data Protection: Coming of

Age, pp. 3–32. Springer (2013)

61. Duggan, M., Ellison, N.B., Lampe, C., Lenhart, A., Madden, M.: Social Media Update 2014.

https://goo.gl/44Kgj5 (2015). Accessed 7 December 2017

62. Zittrain, J.: The Future of the Internet - and how to stop it. Yale University Press (2008)

63. Posner, R.A.: The Economics of Privacy. The American Economic Review 71(2), 405–409 (1981)

64. Bundesverfassungsgericht (BVerfG) (1983)

65. Fire, M., Goldschmidt, R., Elovici, Y.: Online Social Networks. Threats and Solutions. IEEE

Communications Surveys & Tutorials 16(4), 2019–2036 (2014)

66. Novotny, A., Spiekermann, S.: Personal Information Markets and Privacy. A new Model to solve the

Controversy. In: Internationale Tagung Wirtschaftsinformatik, Leipzig (2013)


67. Lin, K.-Y., Lu, H.-P.: Why People use Social Networking Sites. An empirical Study integrating Network

Externalities and Motivation Theory. Computers in Human Behavior 27(3), 1152–1161 (2011)

68. Litt, E.: Understanding Social Network Site Users’ Privacy Tool Use. Computers in Human Behavior

29(4), 1649–1656 (2013)

69. Krasnova, H., Spiekermann, S., Koroleva, K., Hildebrand, T.: Online Social Networks. Why we disclose.

Journal of Information Technology 25(2), 109–125 (2010)

70. Norberg, P.A., Horne, D.R., Horne, D.A.: The Privacy Paradox. Personal Information Disclosure Intentions

versus Behaviors. Journal of Consumer Affairs 41(1), 100–126 (2007)

71. Garg, V., Benton, K., Camp, L.J.: The Privacy Paradox. A Facebook Case Study. In: Research Conference

on Communication, Information and Internet Policy, pp. 1–40 (2014)

72. Stutzman, F., Gross, R., Acquisti, A.: Silent Listeners. The Evolution of Privacy and Disclosure on

Facebook. Journal of Privacy and Confidentiality 4(2), 7–41 (2013)

73. Acquisti, A., John, L.K., Loewenstein, G.: What is Privacy worth? The Journal of Legal Studies 42(2),

249–274 (2013)

74. Grossklags, J., Acquisti, A.: When 25 Cents is Too Much. An Experiment on Willingness-To-Sell and

Willingness-To-Protect Personal Information. In: Workshop on the Economics of Information Security

(2007)

75. Bauer, C., Korunovska, J., Spiekermann, S.: On the Value of Information. What Facebook Users are willing

to pay. In: European Conference on Information Systems (2012)

76. Taylor, C.R.: Consumer Privacy and the Market for Customer Information. RAND Journal of Economics,

631–650 (2004)

77. Tucker, C.: Economics of Privacy and User‐Generated Content. Emerging Trends in the Social and

Behavioral Sciences: An Interdisciplinary, Searchable, and Linkable Resource (2015)

78. Dinev, T., Hart, P.: An extended Privacy Calculus Model for E-Commerce Transactions. Information

Systems Research 17(1), 61–80 (2006)

79. Chellappa, R.K., Shivendu, S.: An Economic Model of Privacy. A Property Rights Approach to Regulatory

Choices for Online Personalization. Journal of Management Information Systems 24(3), 193–225 (2007)

80. Krasnova, H., Veltri, N.F.: Privacy Calculus on Social Networking Sites. Explorative Evidence from

Germany and USA. In: 43rd Hawaii International Conference on System Sciences, pp. 1–10. IEEE (2010)

81. Haucap, J., Heimeshoff, U.: Google, Facebook, Amazon, eBay. Is the Internet driving Competition or

Market Monopolization? International Economics and Economic Policy 11(1-2), 49–61 (2014)

82. Fjell, K., Foros, Ø., Steen, F.: The Economics of Social Networks. The Winner takes it all?, 42.

https://goo.gl/fiajAC (2010). Accessed 7 December 2017

83. Mankiw, N.G.: Principles of Macroeconomics. Cengage Learning (2014)

84. Ciriani, S.: The Economic Impact of the European Reform of Data Protection. Communications &

Strategies(97), 41–58 (2015)

85. Ryz, L., Grest, L.: A new Era in Data Protection. Computer Fraud & Security(3), 18–20 (2016)

86. Allaert, F.A., Barber, B.: Some systems implications of EU data protection directive. European Journal of

Information Systems 7(1), 1–4 (1998)

87. Dammann, U.: Erfolge und Defizite der EU-Datenschutzgrundverordnung. Erwarteter Fortschritt,

Schwächen und überraschende Innovationen, ZD 307 (2016)

88. Hansen, M.: Data Protection by Design and by Default à la European General Data Protection Regulation.

In: Lehmann, A., Whitehouse, D., Fischer-Hübner, S., Fritsch, L., Raab, C. (eds.) Privacy and Identity

Management. Facing up to Next Steps. 11th IFIP International Summer School, Karlstad, Sweden, pp. 27–

38. Springer (2016)

89. Bonneau, J., Preibusch, S.: The Privacy Jungle. On the Market for Data Protection in Social Networks. In:

Economics of Information Security and Privacy, pp. 121–167. Springer (2010)

90. Casadesus-Masanell, R., Hervas-Drane, A.: Competing with Privacy. Management Science 61(1), 229–246

(2015)

91. Dimakopoulos, P., Sudaric, S.: Privacy and Platform Competition. BDPEMS. https://goo.gl/2rjZcu (2017).

Accessed 6 December 2017

92. Hevner, A.R., March, S.T., Park, J., Ram, S.: Design Science in Information Systems Research. MIS

Quarterly 28(1), 75–105 (2004)

93. Vestager, M.: Competition in a Big Data World. DLD Conferences, Munich, Germany, 17 January 2016.

https://goo.gl/jF9SQb. Accessed 16 November 2017

94. Evans, D.S., Schmalensee, R.: Markets with Two-Sided Platforms. Issues in Competition Law and Policy

(ABA Section of Antitrust Law) 1(28) (2008)

95. Evans, D.S., Schmalensee, R.: Failure to Launch: Critical Mass in Platform Businesses. Review of

Network Economics 9(4) (2010)


96. Maxwell, J.A.: Designing a qualitative Study. The SAGE handbook of Applied Social Research Methods 2,

214–253 (2008)

97. Rowley, J., Slack, F.: Conducting a Literature Review. Management Research News 27(6), 31–39 (2004)

98. Forrester, J.W.: Industrial Dynamics. Martino Fine Books (2013)

99. Rochet, J.‐C., Tirole, J.: Platform Competition in Two‐Sided Markets. Journal of the European Economic

Association 1(4), 990–1029 (2003)

100. Armstrong, M.: Competition in Two‐Sided Markets. The RAND Journal of Economics 37(3), 668–691

(2006)

101. Knoll, J.: Advertising in Social Media. A Review of Empirical Evidence. International Journal of

Advertising 35(2), 266–300 (2016)

102. Kwon, H.E., Oh, W., Kim, T.-H.: One-Sided Competition in Two-Sided Social Platform Markets? An

Organizational Ecology Perspective. In: International Conference on Information Systems, Fort Worth

(2015)

103. Tucker, C.E.: Social Networks, Personalized Advertising, and Privacy Controls. Journal of Marketing

Research 51(5), 546–562 (2014)

104. Tucker, C.: Social Advertising. How Advertising that explicitly promotes Social Influence can Backfire.

SSRN (2016)

105. Lawani, O., Aïmeur, E., Dalkir, K.: Improving Users’ Trust through friendly Privacy Policies. An Empirical

Study. International Conference on Risks and Security of Internet and Systems, 55–70 (2015)

106. Acquisti, A., Gross, R.: Imagined Communities. Awareness, Information Sharing, and Privacy on the

Facebook. In: International Workshop on Privacy Enhancing Technologies, pp. 36–58 (2006)

107. Schudy, S., Utikal, V.: 'You must not know about me'. On the Willingness to share Personal Data. Journal

of Economic Behavior & Organization(141), 1–13 (2017)

109. Eisenmann, T., Parker, G., van Alstyne, M.W.: Strategies for Two-Sided Markets. Harvard Business

Review 84(10), 92–103 (2006)

110. Ahn, D.-Y., Duan, J.A., Mela, C.F.: An Equilibrium Model of User Generated Content. NET Institute

Working Paper 11(13), 1–53 (2011)

111. Zhang, K., Sarvary, M.: Differentiation with User-Generated Content. Management Science 61(4), 898–

914 (2014)

112. Martin, K.E.: Transaction Costs, Privacy, and Trust. The laudable Goals and ultimate Failure of Notice and

Choice to respect Privacy online. First Monday 18(12-2), 1–21 (2013)

113. Torres, A.M.: Social Networking and Online Privacy. Facebook Users' Perceptions. Irish Journal of

Management 31(2), 63–97 (2012)

114. Barary Savadkoohi, F.: Personalized Online Promotions. Long-term Impacts on Customer Behavior,

Massachusetts Institute of Technology (2012)

115. Evans, D.S., Schmalensee, R.: The Antitrust Analysis of Multi-Sided Platform Businesses. Coase-Sandor

Institute of Law & Economics Papers 623(1), 1–45 (2012)

116. Hyytinen, A., Takalo, T.: Multihoming in the Market for Payment Media: Evidence from young Finnish

Consumers. Bank of Finland Research Discussion Paper(25) (2004)

117. Mital, M., Sarkar, S.: Multihoming Behavior of Users in Social Networking Web Sites. A theoretical

Model. Information Technology & People 24(4), 378–392 (2011)

118. Choi, J.P.: Tying in Two‐Sided Markets with Multi‐Homing. The Journal of Industrial Economics 58(3),

607–626 (2010)

119. Doganoglu, T., Wright, J.: Multihoming and Compatibility. International Journal of Industrial Organization

24(1), 45–67 (2006)

120. Jerome, J.: Buying and Selling Privacy. Big Data's different Burdens and Benefits. Stanford Law Review

Online (2013)

121. Debatin, B., Lovejoy, J.P., Horn, A.‐K., Hughes, B.N.: Facebook and Online Privacy. Attitudes, Behaviors,

and unintended Consequences. Journal of Computer‐Mediated Communication 15(1), 83–108 (2009)

122. Prigg, M.: One more reason to Google yourself: Search Giant to add Privacy Information letting Users see

what it knows about them. http://dailym.ai/1UugkVe (2016). Accessed 16 November 2017

123. The Irish Times: Facebook introduces new Privacy Controls. https://goo.gl/74AMHK (2008). Accessed 21

July 2016

124. Facebook: Facebook Reports First Quarter 2016 Results, California (2016)

125. McKeon, M.: The Evolution of Privacy on Facebook. Changes in Default Profile Settings over Time.

https://goo.gl/kYibv1 (2010). Accessed 7 December 2017

126. Shore, J., Steinman, J.: Did you really Agree to That? The Evolution of Facebook’s Privacy Policy.

Technology Science (2015)

127. Gross, D.: Facebook Privacy now defaults to Friends only. https://goo.gl/hQZ9ur (2014). Accessed 16

November 2017


128. Reckhow, M.: Introducing Instant Articles. https://media.fb.com/2015/05/12/instantarticles/ (2015).

Accessed 16 November 2017

129. Constine, J.: Facebook Watch original Video Tab launches to all U.S. Users. https://goo.gl/MPdYis (2017).

Accessed 16 November 2017

130. Hui, K.-L., Tan, B.C.Y., Goh, C.-Y.: Online Information Disclosure. Motivators and Measurements. ACM

Transactions on Internet Technology (TOIT) 6(4), 415–441 (2006)

131. Krasnova, H., Hildebrand, T., Guenther, O.: Investigating the Value of Privacy on Online Social Networks.

Conjoint analysis. In: International Conference on Information Systems (2009)

132. Acquisti, A.: The Economics of Personal Data and the Economics of Privacy. Carnegie Mellon University.

http://repository.cmu.edu/heinzworks/332/ (2010). Accessed 30 August 2017

133. Hull, G., Lipford, H.R., Latulipe, C.: Contextual Gaps. Privacy Issues on Facebook. Ethics and information

technology 13(4), 289–302 (2011)

134. Hargittai, E.: Facebook Privacy Settings. Who cares? First Monday 15(8) (2010)

135. Dimensional Research: GDPR: Perceptions and Readiness. A Global Survey of Data Privacy Professionals

at Companies with European Customers. Dimensional Research. https://goo.gl/BRTQuc (2016). Accessed

4 December 2017

136. Netter, M., Riesner, M., Weber, M., Pernul, G.: Privacy Settings in Online Social Networks. Preferences,

Perception, and Reality. In: 46th Hawaii International Conference on System Sciences, pp. 3219–3228.

IEEE (2013)

137. Law, J.: A Dictionary of Business and Management. Oxford University Press (2016)

138. Netter, M., Herbst, S., Pernul, G.: Interdisciplinary Impact Analysis of Privacy in Social Networks. In:

Security and Privacy in Social Networks, pp. 7–26. Springer (2013)

139. Malhotra, N.K., Kim, S.S., Agarwal, J.: Internet Users' Information Privacy Concerns (IUIPC). The

Construct, the Scale, and a Causal Model. Information systems research 15(4), 336–355 (2004)

140. Varian, H.R.: Economic Aspects of Personal Privacy. In: Internet Policy and Economics, pp. 101–109.

Springer (2009)

141. Franke, N., Keinz, P., Steger, C.J.: Testing the Value of Customization. When do Customers really prefer

Products tailored to their Preferences? Journal of marketing 73(5), 103–121 (2009)

142. Weitzner, D.J.: Google, Profiling, and Privacy. IEEE Internet Computing 11(6) (2007)

143. Laudon, K.C.: Markets and privacy. Communications of the ACM 39(9), 92–104 (1996)

144. Schwartz, P.M.: Property, Privacy, and Personal Data. Harvard law review 117(1-1), 2056–2128 (2003)

145. Spiekermann, S., Novotny, A.: A Vision for global Privacy Bridges. Technical and legal Measures for

International Data Markets. Computer Law & Security Review 31(2), 181–200 (2015)

146. Acquisti, A.: Privacy in electronic Commerce and the Economics of immediate Gratification. In: 5th ACM

Conference on Electronic Commerce, pp. 21–29. ACM (2004)

147. Campbell, J.E., Carlson, M.: Panopticon.com. Online Surveillance and the Commodification of Privacy.

Journal of Broadcasting & Electronic Media 46(4), 586–606 (2002)

148. Davies, S.G.: Re-Engineering the Right to Privacy. How Privacy has been transformed from a Right to a

Commodity. In: Technology and Privacy, pp. 143–165. MIT Press (1997)

149. Bergelson, V.: It's personal but is it mine? Toward Property Rights in Personal Information. UC Davis L.

Rev. 37, 379–451 (2003)

150. Cuijpers, C.: A Private Law Approach to Privacy. Mandatory Law obliged. SCRIPTed (2007). doi:

10.2966/scrip.040407.304

151. Lessig, L.: Privacy as Property. Social Research 69(1), 247–269 (2002)

152. Purtova, N.: Property Rights in Personal Data. Learning from the American Discourse. Computer Law &

Security Review 25(6), 507–521 (2009)

153. Varian, H.R.: Intermediate Microeconomics. A Modern Approach. WW Norton & Company (2014)

154. Ackerman, M.S., Cranor, L.F., Reagle, J.: Privacy in E-Commerce. Examining User Scenarios and Privacy

Preferences. In: 1st ACM Conference on Electronic Commerce, pp. 1–8. ACM (1999)

155. Westin, A.F.: Harris Louis & Associates: Harris-Equifax Consumer Privacy Survey (1991)

156. Gross, R., Acquisti, A.: Information Revelation and Privacy in Online Social Networks. In: ACM

Workshop on Privacy in the Electronic Society, pp. 71–80. ACM (2005)

157. Spiekermann, S., Dickinson, I., Günther, O., Reynolds, D.: User Agents in E-Commerce Environments.

Industry vs. Consumer Perspectives on Data Exchange. In: Advanced Information Systems Engineering, p.

1029. Springer, Berlin/Heidelberg (2003)

158. Shapiro, S.P.: Agency Theory. Annual Review of Sociology 31 (2005)

159. McDonald, A.M., Cranor, L.F.: Cost of Reading Privacy Policies. ISJLP 4, 543–568 (2008)

160. Nissenbaum, H.: A Contextual Approach to Privacy Online. Daedalus 140(4), 32–48 (2011)

161. van Blarkom, G.W., Borking, J.J., Olk, J.G.E.: Handbook of Privacy and Privacy-Enhancing Technologies.

The Case of Software Agents. The Hague (2003)


162. StatCounter: Marktanteile von Social Media Seiten nach Seitenabrufen weltweit im Juni 2017. https://goo.gl/o89egQ (2017). Accessed 4 December 2017

163. Akerlof, G.A.: The Market for "Lemons". Quality Uncertainty and the Market Mechanism. The Quarterly Journal of Economics, 488–500 (1970)

164. Janic, M., Wijbenga, J.P., Veugen, T.: Transparency Enhancing Tools (TETs). An Overview. In: 3rd Workshop on Socio-Technical Aspects in Security and Trust, pp. 18–25. IEEE (2013)

165. Hansen, M.: Marrying Transparency Tools with User-controlled Identity Management. The Future of Identity in the Information Society, 199–220 (2008)

166. Cranor, L.F., Langheinrich, M., Marchiori, M.: A P3P Preference Exchange Language 1.0 (APPEL1.0). W3C Working Draft. https://goo.gl/NbsZg7 (2002). Accessed 7 December 2017

167. Pretschner, A., Hilty, M., Basin, D.: Distributed Usage Control. Communications of the ACM 49(9), 39–44 (2006)

168. Hanson, C., Berners-Lee, T., Kagal, L., Sussman, G.J., Weitzner, D.J.: Data-Purpose Algebra. Modeling Data Usage Policies. In: Policies for Distributed Systems and Networks, pp. 173–177. IEEE (2007)

169. Bohrer, K., Liu, X., Kesdogan, D., Schonberg, E., Singh, M., Spraragen, S.: Personal Information Management and Distribution. In: 4th International Conference on Electronic Commerce Research (2001)

170. Jøsang, A., Ismail, R., Boyd, C.: A Survey of Trust and Reputation Systems for Online Service Provision. Decision Support Systems 43(2), 618–644 (2007)

171. Howe, D.C., Nissenbaum, H.: TrackMeNot. Resisting Surveillance in Web Search. Lessons from the Identity Trail: Anonymity, Privacy, and Identity in a Networked Society 23, 417–436 (2009)

172. Ashley, P., Powers, C., Schunter, M.: From Privacy Promises to Privacy Management. A new Approach for Enforcing Privacy throughout an Enterprise. In: Workshop on New Security Paradigms, pp. 43–50. ACM (2002)

173. Mont, M.C., Pearson, S., Bramhall, P.: Towards accountable Management of Identity and Privacy. Sticky Policies and enforceable Tracing Services. In: 14th International Workshop on Database and Expert Systems Applications, pp. 377–382. IEEE (2003)

174. Zimmermann, C., Accorsi, R., Müller, G.: Privacy Dashboards. Reconciling data-driven Business Models and Privacy. In: 9th International Conference on Availability, Reliability and Security, pp. 152–157. IEEE (2014)

175. Buchmann, J., Nebel, M., Roßnagel, A., Shirazi, F., Fhom, H.S., Waidner, M.: Personal Information Dashboard. Putting the Individual Back in Control. Digital Enlightenment Yearbook 2013: The Value of Personal Data, 139–164 (2013)

176. Fischer-Hübner, S., Hedbom, H., Wästlund, E.: Trust and Assurance HCI. Privacy and Identity Management for Life, 245–260 (2011)

177. Zimmermann, C., Cabinakova, J.: A Conceptualization of Accountability as a Privacy Principle. In: International Conference on Business Information Systems, pp. 261–272. Springer (2015)

178. Pearson, S., Charlesworth, A.: Accountability as a Way forward for Privacy Protection in the Cloud. In: IEEE International Conference on Cloud Computing, pp. 131–144. IEEE (2009)

179. Becker, H.S.: Writing for Social Scientists. How to start and finish your Thesis, Book, or Article. ReadHowYouWant.com (2010)

180. Dinev, T., Xu, H., Smith, J.H., Hart, P.: Information Privacy and Correlates. An empirical Attempt to bridge and distinguish Privacy-related Concepts. European Journal of Information Systems 22(3), 295–316 (2013)

181. Crawford, K., Schultz, J.: Big Data and due Process. Toward a Framework to redress Predictive Privacy Harms. Boston College Law Review 55(1), 93–130 (2014)

182. Trepte, S., Teutsch, D., Masur, P.K., Eicher, C., Fischer, M., Hennhöfer, A., Lind, F.: Do People know about Privacy and Data Protection Strategies? Towards the "Online Privacy Literacy Scale" (OPLIS). In: Reforming European Data Protection Law, pp. 333–365. Springer (2015)

183. Petkos, G., Papadopoulos, S., Kompatsiaris, Y.: PScore. A Framework for Enhancing Privacy Awareness in Online Social Networks. In: International Conference on Availability, Reliability and Security, pp. 592–600. IEEE (2015)

184. Zeng, Y., Sun, Y., Xing, L., Vokkarane, V.: A Study of Online Social Network Privacy Via the TAPE Framework. IEEE Journal of Selected Topics in Signal Processing 9(7), 1270–1284 (2015)

185. Abril, P.S.: A (My) Space of one's own. On Privacy and Online Social Networks. Nw. J. Tech. & Intell. Prop. 6(1), 73–88 (2007)

186. Preibusch, S., Hoser, B., Gürses, S., Berendt, B.: Ubiquitous Social Networks. Opportunities and Challenges for Privacy-aware User Modelling. In: Workshop on Data Mining for User Modelling at UM (2007)

187. Spiekermann, S., Acquisti, A., Böhme, R., Hui, K.-L.: The Challenges of Personal Data Markets and Privacy. Electronic Markets 25(2), 161–167 (2015)


188. Weber, R.H.: The Digital Future – A Challenge for Privacy? Computer Law & Security Review 31(2), 234–242 (2015)

189. Kumar, H., Jain, S., Srivastava, R.: Risk Analysis of Online Social Networks. In: International Conference on Computing, Communication and Automation, pp. 846–851. IEEE (2016)

190. Krishnamurthy, B., Wills, C.E.: Characterizing Privacy in Online Social Networks. In: 1st Workshop on Online Social Networks, pp. 37–42. ACM (2008)

191. Zheleva, E., Getoor, L.: To Join or not to Join. The Illusion of Privacy in Social Networks with mixed public and private User Profiles. In: International Conference on World Wide Web, pp. 531–540. ACM, Madrid (2009)

192. Shakimov, A., Cox, L.P.: Privacy Challenges in the Online Social Networking Era. In: 8th Middleware Doctoral Symposium. ACM (2011)

193. Greschbach, B., Kreitz, G., Buchegger, S.: The Devil is in the Metadata. New Privacy Challenges in decentralised Online Social Networks. In: IEEE International Conference on Pervasive Computing and Communications Workshops, pp. 333–339. IEEE (2012)

194. Henne, B., Szongott, C., Smith, M.: SnapMe if you can. Privacy Threats of other Peoples' geo-tagged Media and what we can do about it. In: 6th ACM Conference on Security and Privacy in Wireless and Mobile Networks, pp. 95–106. ACM (2013)

195. Reinbold, F.: Selfie with Merkel Haunts Refugee. Dear Facebook, This Man Is Not a Terrorist. http://spon.de/aeUkm (2017). Accessed 16 November 2017

196. Pötzsch, S.: Privacy Awareness. A Means to solve the Privacy Paradox? In: IFIP Summer School on the Future of Identity in the Information Society. Springer (2008)

197. Aïmeur, E., Gambs, S., Ho, A.: Towards a Privacy-Enhanced Social Networking Site. In: International Conference on Availability, Reliability and Security, pp. 172–179. IEEE (2010). doi: 10.1109/ARES.2010.97

198. Zimmermann, C.: Framework and Requirements for Reconciling digital Services and Privacy. In: European Conference on Information Systems (2016)

199. Hawn, C.: Take two Aspirin and tweet me in the Morning. How Twitter, Facebook, and other Social Media are reshaping Health Care. Health Affairs 28(2), 361–368 (2009)

200. Newman, M.W., Lauterbach, D., Munson, S.A., Resnick, P., Morris, M.E.: It's not that I don't have Problems, I'm just not putting them on Facebook. Challenges and Opportunities in using Online Social Networks for Health. In: Conference on Computer supported Cooperative Work, pp. 341–350. ACM (2011)

201. Coviello, L., Sohn, Y., Kramer, A.D.I., Marlow, C., Franceschetti, M., Christakis, N.A., Fowler, J.H.: Detecting emotional Contagion in massive Social Networks. PLoS ONE 9(3) (2014)

202. Constine, J.: Facebook Launches “Nearby Friends” With Opt-In Real-Time Location Sharing To Help You Meet Up. https://goo.gl/VNesvz (2014). Accessed 16 November 2017

203. Facebook: About Location Targeting. https://goo.gl/eialpk. Accessed 16 November 2017

204. Hert, P. de, Papakonstantinou, V.: The proposed Data Protection Regulation replacing Directive 95/46/EC: A sound System for the Protection of Individuals. Computer Law & Security Review 28(2), 130–142 (2012)

205. Newman, A.L.: What the “Right to be Forgotten” means for Privacy in a Digital Age. Science 347(6221), 507–508 (2015)

206. AuGovPC: Data Availability and Use. Draft Report. https://goo.gl/kmqSND (2016). Accessed 7 December 2017

207. Angwin, J., Varner, M., Tobin, A.: Facebook Enabled Advertisers to Reach ‘Jew Haters’. https://goo.gl/53iNUf (2017). Accessed 16 November 2017

208. Constine, J.: Facebook is building Brain-Computer Interfaces for Typing and Skin-Hearing. https://goo.gl/5OC7mA (2017). Accessed 16 November 2017

209. Eurostat: Social Network Penetration in the European Union 2011-2016. Share of Individuals in the European Union (EU 28) participating in Social Networks from 2011 to 2016. https://goo.gl/w5aMVA (2016). Accessed 7 December 2017

210. eMarketer: Number of Social Media Users worldwide from 2010 to 2020 (in Billions). https://goo.gl/mx2hCz (2017). Accessed 7 December 2017

211. van Est, R., Brom, F.: Technology Assessment. Analytic and Democratic Practice. Encyclopaedia of Applied Ethics 4, 306–320 (2012)

212. Tan, D.R.: Personal Privacy in the Information Age. Comparison of Internet Data Protection Regulations in the United States and European Union. Loy. L.A. Int'l & Comp. L. Rev. 21(4), 661–684 (1999)

213. Hornung, G.: A General Data Protection Regulation for Europe. Light and Shade in the Commission's Draft of 25 January 2012. SCRIPTed (2012). doi: 10.2966/scrip.090112.64

214. Kuner, C.: The European Commission's proposed Data Protection Regulation: A copernican Revolution in European Data Protection Law. Bloomberg BNA Privacy and Security Law Report 11(6), 1–15 (2012)


215. Traung, P.: The Proposed New EU General Data Protection Regulation: Further Opportunities. Computer Law Review International (2), 33–49 (2012)

216. Mantelero, A.: The EU Proposal for a General Data Protection Regulation and the Roots of the ‘Right to be Forgotten’. Computer Law & Security Review 29(3), 229–235 (2013)

217. Victor, J.M.: EU General Data Protection Regulation: Toward a Property Regime for Protecting Data Privacy. Yale LJ 123(513), 513–528 (2013)

218. Kiss, A., Szőke, G.L.: Evolution or Revolution? Steps forward to a new Generation of Data Protection Regulation. In: Reforming European Data Protection Law, pp. 311–331. Springer (2015)

219. Kolah, A., Foss, B.: Unlocking the Power of Data under the new EU General Data Protection Regulation. Journal of Direct, Data and Digital Marketing Practice 16(4), 270–274 (2015)

220. Mayer-Schönberger, V.: Privacy by Regulation. Protecting Personal Data in the Age of Big Data. Keynote at the Amsterdam Privacy Conference, Amsterdam (2015)

221. WPNO: Verschärfte Sanktionen gegen Datenschutzverstöße nach der DSGVO. Aufgaben der Aufsichtsbehörde. https://goo.gl/UWu2k6 (2016). Accessed 29 June 2017

222. Schmitt, J., Stahl, F.: How the Proposed EU Data Protection Regulation Is Creating a Ripple Effect Worldwide. IAPP Privacy Academy 2012, San Jose, CA, USA, 11 October 2012. https://goo.gl/oqEW8Q. Accessed 4 December 2017

223. Parbel, M.: Verbraucherzentralen fordern „Algorithmen-Tüv“. https://heise.de/-3691265 (2017). Accessed 26 October 2017

224. Schantz, P.: Die Datenschutz-Grundverordnung – Beginn einer neuen Zeitrechnung im Datenschutzrecht. Neue Juristische Wochenschrift (NJW) 69(26), 1841–1847 (2016)

225. Kerber, W.: Digital Markets, Data, and Privacy. Competition Law, Consumer Law and Data Protection. Journal of Intellectual Property Law & Practice 11(11), 856–866 (2016)

226. Kamann, H.-G.: Kartellrecht und Datenschutzrecht. Verhältnis einer „Hass-Liebe“? In: Immenga, U., Körber, T. (eds.) Daten und Wettbewerb in der digitalen Ökonomie, pp. 59–80. Nomos Verlagsgesellschaft mbH & Co. KG (2016)

227. Kühling, J.: Neues Bundesdatenschutzgesetz – Anpassungsbedarf bei Unternehmen. Neue Juristische Wochenschrift (NJW) 70(28), 1985–1990 (2017)

228. Roßnagel, A.: Europäische Datenschutz-Grundverordnung – Vorrang des Unionsrechts – Anwendbarkeit des nationalen Rechts. Nomos Verlag, Baden-Baden (2016)

229. Dehmel, S., Hullen, N.: Auf dem Weg zu einem zukunftsfähigen Datenschutz in Europa? Konkrete Auswirkungen der DS-GVO auf Wirtschaft, Unternehmen und Verbraucher. Zeitschrift für Datenschutz 3(4), 147–153 (2013)

230. Albers, M.: Prinzipien des Datenschutzrechts. Art. 6 Rn. 10. In: Wolff, H.A., Brink, S. (eds.) Beck’scher Online-Kommentar Datenschutzrecht, 22nd edn., pp. 1–53 (2017)

231. Faust, S., Spittka, J., Wybitul, T.: Milliardenbußgelder nach der DS-GVO. Ein Überblick über die neuen Sanktionen bei Verstößen gegen den Datenschutz. ZD, 120 (2016)

232. Humphries, M.: Apple will be forced to use micro USB Chargers by 2017. https://goo.gl/7rBvo9 (2014). Accessed 24 November 2017

233. Facebook: Annual Report 2016. https://goo.gl/XreQpM (2017). Accessed 7 December 2017

234. Rodger, A.: Data Privacy Laws: Cutting the Red Tape. Ovum Consulting. https://goo.gl/yVb7St (2016). Accessed 4 December 2017

235. Vanson Bourne LT: EU General Data Protection Regulation (GDPR). Are you ready for it? https://goo.gl/JGjUKD (2016). Accessed 4 December 2017

236. Culnan, M.J., Bies, R.J.: Consumer Privacy. Balancing economic and justice Considerations. Journal of Social Issues 59(2), 323–342 (2003)

237. Trémolet, S., Binderre, D.: Penalties for Non-Compliance. What Penalties are most effective when the Operator is in non-compliance with Regulatory Rules (e.g. for providing Data, setting Prices, or meeting Targets)? https://goo.gl/vRsjng (2010). Accessed 4 December 2017

238. Lipford, H.R., Besmer, A., Watson, J.: Understanding Privacy Settings in Facebook with an Audience View. UPSEC 8, 1–8 (2008)

239. Kelley, P.G., Bresee, J., Cranor, L.F., Reeder, R.W.: A Nutrition Label for Privacy. In: 5th Symposium on Usable Privacy and Security. ACM (2009)

240. Koops, B.-J.: Forgetting Footprints, shunning Shadows. A critical Analysis of the Right to be Forgotten in Big Data Practice. SCRIPTed 8(3), 229–256 (2011)

241. Gibbs, S.: Germany orders Facebook to stop collecting WhatsApp User Data. https://goo.gl/avMFsO (2016). Accessed 16 November 2017

242. Rushton, K.: DuckDuckGo: The Privacy Search ruffling Google's Feathers. goo.gl/TSH8nj (2014). Accessed 25 November 2015


243. Bork, R.H., Sidak, J.G.: What Does the Chicago School Teach About Internet Search and the Antitrust Treatment of Google? Journal of Competition Law and Economics 8(4), 663–700 (2012)

244. Haucap, J., Kehder, C.: Suchmaschinen zwischen Wettbewerb und Monopol: Der Fall Google. In: Dewenter, R., Haucap, J., Kehder, C. (eds.) Wettbewerb und Regulierung in Medien, Politik und Märkten. Festschrift für Jörn Kruse zum 65. Geburtstag, pp. 115–154. Nomos Verlagsgesellschaft mbH & Co. KG (2013)

245. van Eecke, P.: Technology Firms and the European General Data Protection Regulation: How should they prepare? http://goo.gl/hI7yQJ (2016). Accessed 16 November 2017

246. dpa/AP: Datenschutz-Anpassung: WhatsApp gibt Telefonnummern an Facebook weiter. http://spon.de/aeOMD (2016). Accessed 25 November 2017

247. Sullivan, D.: Facebook Instant Articles: A Slippery Slope for Google to do the Same, hurting the Web? goo.gl/ui0PZz (2015). Accessed 11 August 2016

248. Constine, J.: Wait, did Facebook just build a Kickstarter Competitor? goo.gl/Eagrp7 (2015). Accessed 16 November 2017

249. Kaiser, U., Wright, J.: Price Structure in Two-Sided Markets. Evidence from the Magazine Industry. International Journal of Industrial Organization (2006). doi: 10.1016/j.ijindorg.2005.06.002

250. Athey, S., Calvano, E., Gans, J.S.: The Impact of Consumer Multi-Homing on Advertising Markets and Media Competition. Management Science (2016). doi: 10.2139/ssrn.2180851

251. Olbrich, R., Holsing, C.: Facebook Ads. WiSt – Wirtschaftswissenschaftliches Studium (2014). doi: 10.15358/0340-1650_2014_10_557

252. eMarketer: Net Digital Advertising Revenue Share of major ad-selling Online Companies worldwide from 2012 to 2014. https://goo.gl/hNSvQ7 (2014). Accessed 25 November 2017

253. Stelzner, M.A.: 2017 Social Media Marketing Industry Report. How Marketers are using Social Media to grow their Businesses. Social Media Examiner. https://goo.gl/wMPxzE (2017). Accessed 7 December 2017

254. DeLong, J.B., Summers, L.H.: The 'New Economy': Background, historical Perspective, Questions, and Speculations. Economic Review – Federal Reserve Bank of Kansas City 86(4), 29–59 (2001)

255. Rosen, J.: The Right to be Forgotten. Stanford Law Review Online (2012)

256. Terwangne, C. de: Internet Privacy and the Right to be Forgotten/Right to Oblivion. In: VII Congreso Internacional Internet, Derecho y Política. Neutralidad de la red y otros retos para el futuro de Internet, vol. 13, pp. 109–121 (2012)

257. Infosecurity Magazine: Snapchat’s expired Snaps are not deleted, just hidden. https://goo.gl/WhDT2K (2013). Accessed 16 November 2017

258. Russell, J.: Facebook Stories, yet another Snapchat Clone, is rolling out to more Countries. http://tcrn.ch/2msoQHT (2017). Accessed 16 November 2017

259. Yoo, C.S.: When Antitrust met Facebook. Geo. Mason L. Rev. 19(5), 1147–1162 (2012)

260. Berners-Lee, T.: Long live the Web. Scientific American 303(6), 80–85 (2010)

261. Swire, P., Lagos, Y.: Why the Right to Data Portability likely reduces Consumer Welfare. Antitrust and Privacy Critique. Md. L. Rev. 72, 335–380 (2012)

262. Weiss, S.: Privacy Threat Model for Data Portability in Social Network Applications. International Journal of Information Management 29(4), 249–254 (2009)

263. Miller, P.: Interoperability. What is it and why should I want it? Ariadne (24) (2000)

264. Breslin, J., Bojars, U., Passant, A., Fernandez, S., Decker, S.: SIOC: Content Exchange and semantic Interoperability between Social Networks. In: Workshop on the Future of Social Networking. W3C (2009)

265. European Committee for Interoperable Systems: Interoperability & Intraoperability. https://goo.gl/Wr5vGJ. Accessed 7 December 2017

266. Diedrich, O.: Die Woche: Alle gegen Microsoft. https://goo.gl/Sc9kuX (2009). Accessed 16 November 2017

267. Martinez, F.: What is Interoperability? https://goo.gl/pkSWYU (2012). Accessed 7 December 2017

268. Hinchcliffe, D.: Where is Interoperability for Social Media? http://zd.net/1dJYeHF (2014). Accessed 16 November 2017

269. Chisnall, D.: Open Standards for Social Networks. https://goo.gl/ep3oz (2011). Accessed 16 November 2017

270. Saint-Andre, P.: Extensible Messaging and Presence Protocol (XMPP): Core. IETF. RFC 6120. https://goo.gl/A9rY64 (2011). Accessed 7 December 2017

271. Jacobs, I.: OpenSocial Foundation moves Standards Work to W3C Social Web Activity. https://goo.gl/2GTUCa (2014). Accessed 7 December 2017

272. Yeung, C.A., Liccardi, I., Lu, K., Seneviratne, O., Berners-Lee, T.: Decentralization: The Future of Online Social Networking. In: Workshop on the Future of Social Networking, pp. 2–7. W3C (2009)


273. Shah, R.: Striving for Interoperability in Social Business. https://goo.gl/fmVSGz (2011). Accessed 4 October 2017

274. Rescorla, E.: HTTP over TLS. IETF. RFC 2818. https://goo.gl/C1qp2d (2000). Accessed 7 December 2017

275. Spronk, R.: Impact of the GDPR on the Use of Interoperability Standards. https://goo.gl/cso3U7 (2017). Accessed 7 December 2017

276. Clarke, R.: The digital Persona and its Application to Data Surveillance. The Information Society 10(2), 77–92 (1994)

277. Nilakanta, S., Scheibe, K.: The Digital Persona and Trust Bank. A Privacy Management Framework. Journal of Information Privacy and Security 1(4), 3–21 (2005)

278. Clarke, R.: Persona missing, feared drowned. The Digital Persona Concept, two Decades later. Information Technology & People 27(2), 182–207 (2014)

279. Arrington, M.: OpenID Welcomes Microsoft, Google, Verisign and IBM. https://goo.gl/eRHe1Z (2008). Accessed 16 November 2017

280. Rhoen, M.: Rear View Mirror, Crystal Ball. Predictions for the Future of Data Protection Law based on the History of Environmental Protection Law. Computer Law & Security Review (2017). doi: 10.1016/j.clsr.2017.05.010

281. Pfanner, E.: French Tax Proposal Zeroes In on Web Giants’ Data Harvest. https://nyti.ms/2iRAfVS (2013). Accessed 16 November 2017

282. Goolsbee, A., Zittrain, J.L.: Evaluating the Costs and Benefits of taxing Internet Commerce. The Quarterly Journal of Economics 115(2), 561–576 (1999)

283. Voison, G.: A new Tax on Personal Data Collection? https://goo.gl/ZXKg3o (2013). Accessed 16 November 2017

284. Han, B.-C.: Unsere gefühlte Freiheit. https://goo.gl/UfjACv (2014). Accessed 16 November 2017

285. Evans, D.S.: Antitrust Issues Raised by the Emerging Global Internet Economy. Nw. UL Rev. 102(1), 285–306 (2008)

286. Schmalensee, R.: Antitrust Issues in Schumpeterian Industries. The American Economic Review 90(2), 192–196 (2000)

287. Michal, W.: Heiko Maas und der Algorithmen-TÜV. https://goo.gl/XG3PGg (2017). Accessed 16 November 2017

288. Lobe, A.: Gebt die Algorithmen frei! Strategien der Digitalkonzerne. http://www.faz.net/-gqz-8z26h (2017). Accessed 29 June 2017

289. Lipsman, A., Lella, A.: U.S. Cross-Platform Future in Focus. comScore. https://goo.gl/pS7wQb (2016). Accessed 7 December 2017

290. Mancini, C., Thomas, K., Rogers, Y., Price, B.A., Jedrzejczyk, L., Bandara, A.K., Joinson, A.N., Nuseibeh, B.: From Spaces to Places. Emerging Contexts in mobile Privacy. In: 11th International Conference on Ubiquitous Computing, pp. 1–10. ACM (2009)

291. Almuhimedi, H., Schaub, F., Sadeh, N., Adjerid, I., Acquisti, A., Gluck, J., Cranor, L.F., Agarwal, Y.: Your Location has been shared 5,398 Times! A Field Study on Mobile App Privacy Nudging. In: 33rd Annual ACM Conference on Human Factors in Computing Systems, pp. 787–796. ACM (2015)

292. Mangalindan, J.P.: San Francisco's Rent Riot. http://for.tn/1goeUeT (2011). Accessed 16 November 2017