
How Private is Private?: Effects of Varied Transparency on Group Ideation

Michael C. Stewart

Thesis submitted to the Faculty of the Virginia Polytechnic Institute and State University

in partial fulfillment of the requirements for the degree of

Master of Science in

Computer Science and Applications

Deborah G. Tatar, Chair
Steven R. Harrison

D. Scott McCrickard
Manuel A. Pérez-Quiñones

November 29, 2012
Blacksburg, Virginia

Keywords: Computer-Supported Cooperative Work, collaboration, Human-Computer Interaction, group work, sharing, transparency, privacy, ideation, group ideation, idea generation

Copyright 2013, Michael C. Stewart


How Private is Private?: Effects of Varied Transparency on Group Ideation

Michael C. Stewart

ABSTRACT

Many Computer-Supported Cooperative Work (CSCW) applications go to great lengths to maximize transparency by making available participants’ actions and respective application states to all others in real-time. Designers might intend to enhance coordination through increased transparency, but what other outcomes might be influenced by these choices? We developed two versions of a CSCW application to support a group idea generation task for collocated groups. One version had diminished transparency in comparison to the other. We studied the effects of this varied transparency on the groups’ generativity and collaboration. We found that in modulating transparency there was a trade-off between generativity and collaboration. Groups with diminished transparency felt that their groupmates built on their ideas more, but groups with increased transparency were more generative. These findings are tentative but suggest that the full story of group vs. solitary, private vs. public manipulations of technology, at least in the area of idea generation, is not yet sufficiently theorized or understood.


Dedication

To my wife, Emily, whose love and support sustains and motivates me to work to cherish her the way I feel she cherishes me every single day.



Acknowledgments

I, like many of my colleagues, came to Virginia Tech for a Ph.D. in Computer Science. While my goal is to contribute at that level, I do not see the completion of this, my Master’s thesis, as a mere stepping stone. Certainly it has prepared me for my dissertation work, but it has been a journey all its own. I am grateful for the many people in my life who have helped me with my intellectual and personal pursuits, which have helped me achieve this accomplishment.

Thanks to my faculty committee: Deborah Tatar, Manuel Pérez-Quiñones, Steve Harrison, and Scott McCrickard for agreeing to mentor me in this process, asking the hard questions, and encouraging me on my way. Thanks to Mrs. Jessie Eaves for her tireless smiles and reliable rescues when I failed to gain physical access to the study location. Without the help of several undergraduate researchers, I would not have been able to complete this work. Thank you Lindsay Blumberg, Harold William “Wil” Collins III, Christian Lutz, Avery Sandridge, John Krulick, and Nolan Henry. In completing this document, I learned quite a lot about the LaTeX Document Preparation System (LATEX); for their assistance in this I am grateful to Ankit Ahuja, Robert “Bobby” J. Beaton III, StackExchange [49], and the LaTeX Table Generator [48]. Gary Bishop [6] mentored me as an undergraduate and was instrumental in my interest in research and my successful application to graduate school. For these, his role-modeling, and his friendship, I am eternally grateful.

I completed this work as a happy member of the Third Lab [53], and I thank Joon-Suk Lee, Stacy Branham, Bobby Beaton, Samantha “Sammy” Yglesias, Jose Alvarado, Deborah Tatar, and Steve Harrison for their support through discussion, questions, encouragement, and role-modeling. This work was supported by NSF Grant IIS 1018607 HCC-Small [51]. Our lab is within the Center for Human-Computer Interaction (CHCI) [2], and the support and resources of this infrastructure, especially the “Black Lab” resources [1], are critical to our work’s success. Finally, the subject pool maintained by the Virginia Tech Department of Psychology [3] greatly facilitates participant recruitment. Peter Radics, also of CHCI, is a no-nonsense reviewer whose opinionated maxims improved this writing.

Aside from the many named and unnamed colleagues who helped me with this work directly and indirectly, many of my framily members reviewed drafts, discussed questions, or encouraged my progress on this project: Emily Stewart, Siroberto Scerbo, Justis Peters, Nathaniel “Nathan” Hammond, Olando “Shun” Coaster, Patricia Stewart, Phillip Stewart, Sr., Nancy Edwards, Ransom “Randy” Zartman, Charles “Charley” Mays, and Janet “Jan” Mays. Thank y’all.

As I entered the data analysis phase of this project, I had the great pleasure of persistent assistance from a post-baccalaureate researcher. In order to help with my work, he underwent training in ethics in research and data analysis. He stayed up late to work alongside me, read drafts, thought of synonyms, and was otherwise generally helpful. I do not deserve the friendship or level of support that Christopher “Chris” Frisina continues to give me. I shall endeavor to be the same kind of friend and colleague to others. Chris is my hero.

Finally, I am grateful to the many participants who gave an hour or so of their lives to this work so that we might be able to understand a piece of the world a little better.



Contents

1 Introduction 1

2 Related Work 3

2.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3

2.2 Group Work . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3

2.3 Brainstorming . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5

2.4 Creativity . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6

2.5 Participatory Design . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7

2.6 Privacy . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7

2.7 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9

3 Group Design Study 11

3.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11

3.2 Construction of the Situation . . . . . . . . . . . . . . . . . . . . . . . . . . 11

3.3 Study Design . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13

3.4 Description of Procedures . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13

3.5 Data Collection . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16

3.5.1 Data Management . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16

3.6 Technologies . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16

3.7 Description of Population . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17

3.7.1 Recruitment . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 20

3.7.1.1 Incentives . . . . . . . . . . . . . . . . . . . . . . . . . . . . 20



4 Results 23

4.1 Data Analysis . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 23

4.1.1 Post-Session Questionnaire Results . . . . . . . . . . . . . . . . . . . 23

4.1.1.1 Demographics . . . . . . . . . . . . . . . . . . . . . . . . . . 24

4.1.1.2 User Experiences . . . . . . . . . . . . . . . . . . . . . . . . 24

4.1.2 Video, Transcripts, and Codes . . . . . . . . . . . . . . . . . . . . . . 32

5 Conclusion 35

5.1 Discussion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 35

5.1.1 Group Work, Brainstorming, and Creativity . . . . . . . . . . . . . . 36

5.1.2 Privacy . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 36

5.1.3 Coordination . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 37

5.1.4 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 38

5.1.4.1 Limitations . . . . . . . . . . . . . . . . . . . . . . . . . . . 38

5.2 Future Work . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 39

5.2.1 Quality . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 39

5.2.2 Analysis of Variance of Interaction of Participants by Condition . . . 39

5.2.3 Extension of Study with Modified Interface . . . . . . . . . . . . . . . 40

Bibliography 42

Appendix A Questionnaire 47

Appendix B Guidelines for Researchers Instructing Participants 55

B.1 Description of Study to Participants . . . . . . . . . . . . . . . . . . . . . . . 55



List of Figures

3.1 The starting view of the user game software. . . . . . . . . . . . . . . . . . . 12

3.2 Layout of the Study Room: depicts the three positions of the participants and their laptop computers around the table (A-C), the cameras aimed at them, and the position of the researcher (R). . . . . . . . . . . . . . . . . . . . . . 14

3.3 The view of the user game software while a session is underway. . . . . . . . 15

3.4 Self-Reported Age of Participants . . . . . . . . . . . . . . . . . . . . . . . . 17

3.5 Self-Reported Academic Year of Participants . . . . . . . . . . . . . . . . . . 18

3.6 Self-Reported Gender of Participants . . . . . . . . . . . . . . . . . . . . . . 18

3.7 Self-Reported Major of Participants . . . . . . . . . . . . . . . . . . . . . . . 19

3.8 Prop to assist the researcher in explaining the user game software to the participants . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 21

3.9 Participants playing the User Game . . . . . . . . . . . . . . . . . . . . . . . 21

3.10 Participants playing the User Game . . . . . . . . . . . . . . . . . . . . . . . 22

4.1 Sixty-two out of sixty-nine participants reported English as their first language. 24

4.2 Nationality and ethnicity of participants. . . . . . . . . . . . . . . . . . . . . 25

4.3 Means by transparency (condition) and card deck (dining vs. study) for “On their turn, to what extent did others build on your ideas?” . . . . . . . . . . 26

4.4 Means by transparency (condition) and card deck (dining vs. study) for “How often did other people’s ideas capture your imagination?” . . . . . . . . . . . 30

4.5 Means by transparency (condition) and card deck (dining vs. study) for “On your turn, how often did you build on other people’s ideas out loud?” . . . . 31

4.6 Mean number of contributions of each condition, by card deck. . . . . . . . . 34



5.1 Degrees of transparency between conditions. On the left: the less transparent condition; on the right, the more transparent condition. Red represents little or no transparency, yellow indicates some transparency, and green denotes most transparency. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 37

5.2 Degrees of transparency between conditions. The left two conditions show the conditions from the current study; the right-most condition is proposed as future work. Red represents little or no transparency, yellow indicates some transparency, and green denotes most transparency. . . . . . . . . . . . . . . 41



List of Tables

3.1 Self-Reported Age of Participants . . . . . . . . . . . . . . . . . . . . . . . . 17

3.2 Self-Reported Academic Year of Participants . . . . . . . . . . . . . . . . . . 18

3.3 Self-Reported Gender of Participants . . . . . . . . . . . . . . . . . . . . . . 19

4.1 Means for Post-Session Self-Report Questions Concerned with Ease of Idea Generation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 26

4.2 Means for Post-Session Self-Report Questions Concerned with Social Inhibition 27

4.3 Means for Post-Session Self-Report Questions Concerned with Quality . . . . 27

4.4 Means for Post-Session Self-Report Questions Concerned with Collaboration 28

4.5 Textual examples from the transcripts for each of the codes. . . . . . . . . . 33

5.1 Examples of Log Data and their Interpretations . . . . . . . . . . . . . . . . 40



Chapter 1

Introduction

Collaboration, cooperation, and other work in groups have helped humanity accomplish great things. Whether or not the participants are willing, the results of group work can still be incredible. Nations, economies, and wonders of the world have been built by people working together. Conversely, there are times when people come together and are, confusingly, unable to accomplish their goal, despite apparently sufficient resources and willingness. Why is it that some groups succeed, sometimes in contradiction to intuitive predictions, while other groups, whose success may even seem guaranteed, fail? What are the factors that affect various metrics of group performance such as enjoyment, quality of work accomplished, and quantity of work accomplished? The potential for such a wide range of success and failure, and the complexity of the problem of identifying factors impacting success, failure, and other outcomes, make group work an intriguing area of study.

Additionally, much of our work, our play, indeed much of our lives, is currently lived in or through our technology. As various activities transfer from the physical world to the virtual, there are many differences, or at least changes. Some of these are frequently touted as the benefits of computation. To give three simple, ubiquitous examples, we can now work at home, on the train ride home, or at the beach.

The field of Computer Supported Cooperative Work focuses on the juncture of these two areas: how group work is mediated by means of technology. When technology is included in the picture, we find circumstances that could not have existed previously, and new questions that may require different sets of methods. Some familiar concepts from group work may be different in the presence of technology, and new categorizations or definitions are necessary to explain patterns of collaboration or more general interaction that were invariant before technology enabled these new patterns.

Many technologically-mediated scenarios threaten to change the old notions, assumptions and balances that regulate human behavior and experience. In particular, technology allows very different relationships to the notion of privacy than existed previously.




This is true in a kind of brute-force way: in a time when many people carry smartphones with them everywhere, into many more settings than would have been possible without the technology, and feel comfortable interacting with their social networks through these devices in public, they may inadvertently broadcast private information in public spaces or fail to attend to the demands of the public space they occupy physically. And when these phones are filled with not only their own personal information but also personal information about many of their family members, information that may have seemed quite private prior to the ubiquity of smartphones can now be found flaunted on our screens in “public.” In fact, the terms “public” and “private” are now overloaded and used to describe situations not previously imagined. For example, where “sharing” was previously used, in a world before individuals saw information as a resource, to refer to allocating one’s resources to others, “shared” may now be seen as the opposite of private. A specific example of this can be seen in social networking applications, where managing one’s privacy settings actually means deciding with whom to share information. Indeed, we may be coming to see our information as a collection of assets, resources to be protected, bargained over, and exchanged.

This is a somewhat critical view of the situation created by technology, but the current technological situation is not all about confusion; there are positive sides as well. One potential is to optimize human behavior by creating situations with new blends of public and private interactions.

In the current study, I created two novel blends of public and private spaces on laptop computers and investigated the effects of these privacy options on group and personal idea generation around a specific task. I investigated some new options through a study of collocated, technologically-mediated group work, and report on my findings about human behavior and experience. In my Design Game Study, I asked participants, in groups of three, to play a software version of a design game modeled after Brandt and Messeter’s User Game [7] to facilitate their design of a campus facility.


Chapter 2

Related Work

2.1 Introduction

Idea generation can be successful in groups or in solitude, without much technological support or with custom-built systems. Given my interest in the meaning of technology in group and solitary ideation, I have found a number of areas of prior work that support my inquiry.

Prior work on Group Work includes examinations of the parameters of effective group work as measured by several factors such as efficiency and satisfaction. Relevant Brainstorming literature reports on the successes of brainstorming in groups versus individually, and with or without various supporting technologies. From the Creativity literature I recount the definitions some use for creativity and how it might be different in a group, as well as factors that may support this creativity. In Participatory Design, I find similar work exploring the relationship between concerns of individual members and their participation in a group. Finally, from work on Privacy I use definitions of privacy and solitude as well as explanations of people’s preferences for various kinds of privacy.

2.2 Group Work

There is a long tradition of work examining the efficacy of group work. Some of the oldest research, starting with Triplett in psychology [54], concerns the effects of the group on individual performance. Work in this area pits creatures (people, rats, even cockroaches) against one another in tasks such as running mazes, pulling ropes and learning mazes. Sometimes, performance of the individual alone wins out over performance of the individual in the presence of others, and sometimes performance in social settings wins out over individual performance.




Factors influencing whether social facilitation (increased performance in the presence of others) or social loafing (decreased performance in the presence of others) occurs include individual accountability, physiological arousal and task complexity [62, 14]. That is, performance on complex tasks is facilitated by low accountability and low physiological arousal in the presence of others, while performance on simple tasks is facilitated by high accountability and high physiological arousal in the presence of others. Thus performance on an already learned task such as Olympic track pursuit bicycling is enhanced by the presence of the competitor on the track. However, all other things being equal, learning to bicycle would not be enhanced by the presence of others [4]. The thought is that the presence of others in conditions of high arousal and accountability encourages the emergence of the person’s dominant response and discourages the range of responses necessary to engage in learning [25].

Given that even cockroaches show these effects, they could be of some concern in studying (human) group work. But human collaboration and cooperation are more complex. Raw performance metrics may not capture cognitive and emotional variables that influence the person at a higher level than is directly observable, nor is it always obvious what constitutes a previously learned task or what keeps people’s attention on a particular task.

Many years of research in sociology, psychology and organizational behavior have been devoted to questions of how groups work and how they can work better. From this point of view, as well as from the earlier social facilitation/loafing perspective, more is often better, but not always. For example, from software engineering research, we have Brooks’s seminal work, The Mythical Man-Month, in which he demonstrates how adding people to a late project only makes it later [8]. Research in psychology, sociology and organizational behavior tells a different, more complex story; a number of different variables (e.g. evaluative tone, anonymity, social inhibition) emerge as important. One question of paramount importance has been how we can encourage people to generate the best possible and widest range of ideas. Early ideas about how to encourage the creation and identification of new ideas arose in the context of advertising, with the invention of the brainstorming process.

One area of active inquiry in group work is concerned with the relative effectiveness of work in groups versus individuals working alone. For many, our intuition suggests that “two heads are better than one.” However, the research does not consistently support this simple expectation. For example, Warr and O’Neill “present a theoretical account of why social creativity should in principle be more productive than individual creativity,” but then they go on to “explain findings to the contrary” [59]. Indeed, others have seen for various tasks that groups are less effective. Taylor found that “group participation when using brainstorming inhibits creative thinking,” with the group participants producing fewer ideas per person, fewer unique ideas per person, and ideas of less approximated quality than individuals [52, p. 23]. Diehl and Stroebe work to shed some light on these contradictions of intuition. They found production blocking to be largely responsible for productivity losses in real groups [19]. Lamm and Trommsdorff had similarly found that, for tasks requiring ideational proficiency, groups were less effective than individuals, and they hypothesized that this might be explained in part by social inhibition [29].



Continuing in their pursuit, Diehl and Stroebe later found that groups were less effective than individuals for idea generation, but importantly, not because of having more time to filter their own ideas (social inhibition, as hypothesized by Lamm and Trommsdorff [29]), but rather because of the distraction created by the group setting [20]. In continued work on production blocking, Nijstad et al. found more nuanced effects and mechanisms of production blocking, such as that the generation of semantically related ideas was more disrupted, and that part of the problem described as production blocking was participants’ need to monitor delay, to know when it would be their turn [33]. As turn-taking was a part of the design of the current study, these results may provide a context for understanding the results.

Aside from this work on the factors mediating the performance of groups, Paulus et al. studied the effect of “social cohesion” (the hypothesized enhanced performance by an acquainted group) on a group’s idea generation, and were unable to find a statistically significant effect [38].

Attempts to improve group performance borrow from the theory of Distributed Cognition, particularly the importance of external memory. Pissarra et al. found that access to external memory improved group performance, and interestingly, that groups that did not have to combine or merge their ideas with each other’s performed worse than those who did [42]. The design of the software for the current study incorporated these findings by having a staging area in which participants could externalize their plans for their turn.

Researchers measure group performance in several different ways, including measures of productivity, efficiency, and participant satisfaction. Researchers also examine several phenomena that might modulate these metrics, such as Warr and O’Neill’s three-factor hypothesis of “production blocking, evaluation apprehension and free riding [social loafing]” [59].

2.3 Brainstorming

In addition to these questions, Warr and O’Neill also pose our research question, “what are the effects of public, social and private interaction on creativity?” [59]. I examine this question in a brainstorming context. First proposed by Osborn, brainstorming is a technique that is intended to facilitate idea generation in groups. The idea is that people sit together in the same room and generate verbal ideas about a problem or project freely. There are four rules: 1) encourage quantity, 2) refrain from judgment, 3) encourage the unusual and 4) build on other people’s ideas. These are intended to encourage the group to produce more and better ideas [35]. The intuition is that better outcomes will be produced simply through the generation of a higher quantity of more varied ideas. The claim is that following a technique will result in a higher number of quality ideas than would be produced by the same number of individuals generating ideas in isolation. Since its publication in the 1950’s, brainstorming techniques have been adopted in a wide variety of circumstances in and outside the workplace, in schools and in public discourse. Different group sizes and constitutions have been used as well as wide variations in task.



So too have been variations on the basic concept, including the creation of phases and the provision of shared or private external memory devices. Popular books give advice to businessmen, teachers and the general public.

Brainstorming is a perfect setting for studying Warr and O’Neill’s (aforementioned) question, “what are the effects of public, social and private interaction on creativity?” [59]. However, in studying brainstorming, there are many factors to consider. In addition to the aforementioned work comparing group work to individual work, there is some work that has found that, in certain circumstances for idea generation, groups can outperform individuals. Gallupe et al. found that with the use of an electronic brainstorming system, groups produced more “nonredundant ideas” than did groups without [22]. Later in this vein of work, Dennis and Valacich found that groups utilizing computerized support for brainstorming generated more ideas than did individuals [17]. As many researchers have turned to technology such as electronic brainstorming systems to mitigate some negative effects of group work, we must ask: what might then be the negative or side-effects of the introduction of technology to this complex problem and social space?

Further enhancing the brainstorming experience, and contending with “evaluation apprehension,” which would become a component of Warr and O’Neill’s three-factor hypothesis (including also production blocking and social loafing) [59], Connolly et al. found that members of anonymous, critical groups performed best, indicating that social inhibition may indeed be a factor mediating the performance of idea-generating groups [13]. To incorporate these findings, our CSCW application was designed with varied levels of transparency or privacy that were intended to vary this anonymity.

2.4 Creativity

In our interest to explore “micro-coordination,” the relationship between “micro-level, situated actions and broader outcomes” [30], I chose creativity as a candidate for a possible broader outcome of our intervention. However, creativity is a broad term and its use in so many fields makes it hard to define for specific purposes. As reported above, creativity is seen as a metric of success for idea generation and is an area of interest in research (e.g. [59]). However, to use creativity as a metric, for example of performance of ideation in groups, we must have some definition of creativity that facilitates measurement. Paulus [36] builds on Brown et al.’s definition of ideational fluency (“the probability of generating an idea in the next unit time interval is relatively high” [9]) to synthesize his definition of group creativity, “divergent thinking in groups as reflected in ideational fluency.”



2.5 Participatory Design

Among many other areas, one that has studied the intersection between creativity and group work from a perspective including those broader concerns relevant to micro-coordination is Participatory Design. Developing from work with labor movements, particularly in Scandinavia, and the planning movement in the United States [27, 5], Participatory Design is a field of inquiry that studies (1) how to include participants, or, in the words of conventional researchers, “subjects,” in the design process, and (2) the effects of including them. In Participatory Design, productivity in group ideation might often be considered secondary to other metrics. For example, researchers in Participatory Design might consider whether their participants enjoyed the ideation session, whether they felt increased ownership of the process, artifacts, and eventual product, and whether the researchers felt they better understood the participants. These qualitative metrics are more important than the quantitative metrics that have been the focus of the brainstorming studies [61].

From a psychological point of view, the concept of group ideation is complex. That is, group ideation is influenced by many factors such as group cohesion, production blocking and social inhibition. However, experimental scenarios control for exogenous factors which are considered central from the point of view of Participatory Design. Philosophically, Participatory Design is tied to Activity Theory, with its underlying concern with systems as consisting of tensions between important components, such as objects, subjects, and rules. Pragmatically, researchers and/or designers working with participants must contend with issues such as the challenges of trans- and interdisciplinary perspectives and the incommensurate power of the participants.

One line of research in Participatory Design has been methods for improving group ideation among participants. Brandt and Messeter contribute Design Games as a possible method for improving these group design processes in Participatory Design projects [7]. They observe that “[p]laying games and designing are both social enterprises, evolve over time and are based on a set of rules.” To take advantage of this observation, they designed games with the overall aim “to provide multiple stakeholders with means for developing, negotiating and expressing a shared understanding of users, use contexts and technology as part of concept design activities” [7]. Hornecker reports her findings with users using a similar “approach for structuring idea generation that supports the free flow of ideas” [24]. Following from these, I designed the two conditions of software for my study, based on the User Game as detailed by Brandt and Messeter [7].

2.6 Privacy

Following on Connolly’s results related to anonymity and evaluative tone [13], the current study used a kind of privacy as the independent variable to determine whether there would be any broader effects from its variance between subjects. Privacy is an overloaded word; I seek here to clarify the conceptualization in the current study.

Privacy is important. Since 1948 the United Nations has recognized it as a fundamental Human Right [56]. As technology advances, first with the adoption of the Internet, and now particularly as portable devices with Internet connectivity and sensors such as cameras and microphones have become more common, notions relating to privacy have become more complex. DeCew helps us see one way we have taken the notion of privacy and interpreted it in the context of our technological society. She explains how, under certain circumstances, we interpret control of information as privacy.

“People have many different reasons for wanting to control information about themselves, motives ranging from freedom from defamation to commercial gain. When freedom from scrutiny, embarrassment, judgment, and even ridicule are at stake, as well as protection from pressure to conform, prejudice, emotional distress, and the losses in self-esteem, opportunities, or finances arising from these harms, we are more inclined to view the claim to control information as a privacy claim” [16].

Jones and Teevan describe some of this complexity in terms of a person’s PSI, or Personal Space of Information, and the many silos this includes [28]. Some questions that arise today might be: is the text (SMS) “conversation” a person has with one friend while at lunch with another private? Is the lunch the two friends are having (while one is sending SMS messages to yet another friend) private? Similar questions have been asked in the workplace, with colleagues sitting just a few feet apart with at most a cubicle wall to separate them. Computer Supported Cooperative Work literature also raises similar questions, where the very systems that are developed and studied must make decisions about how any notion of “privacy” should be implemented. Yet another perspective from which privacy has been studied is public policy. Shapiro and Baker remind us well that “[t]he capabilities of information technology and the erosion or protection of information privacy are not the inevitable consequences of systems evolution but instead are produced by specific people with particular interests” [47]. And importantly, “[j]ust as information technology can erode privacy, it can also enhance privacy” [47].

Because of its long history as a concept and the evolving scenarios in which it is used and studied, it would be prudent to review some of the work and to scope which aspects of privacy pertain to my work. I found Carew and Stapleton’s survey of literature relevant to privacy for information systems design quite helpful. They review work defining privacy and its use in both theoretical and more practical applications [11]. Further, they point us to a very helpful taxonomy of privacy developed over many years and reported across several of Pedersen’s publications.

First, extending work by Westin in which he proposes solitude, isolation, anonymity, and intimacy as the four types of privacy [60], Pedersen proposed adding reserve and further decomposing intimacy into that with family and that with friends, resulting in six types of privacy [39]. Later, Pedersen investigated the psychological needs fulfilled by each of these six types of privacy, and found a number of such needs, including that people seeking to fulfill a particular function would seek a different one of the aforementioned six types of privacy [40]. These functions were further explored in still another study by Pedersen to determine the degree to which each of five of the functions studied in [40] (autonomy, confiding, rejuvenation, contemplation, and creativity) was fulfilled by each of the six types of privacy [41]. The result of this work is a model of privacy useful for both theoretical and applied research. From their work in developing this model, we learn that out of all of the six types of privacy, participants indicated that “solitude (freedom from observation by others)” most increased the privacy function of creativity [41].

More applied research studying the effects of privacy on group processes has resulted in some interesting findings that I take into account in my work. Connolly et al. studied the Effects of Anonymity and Evaluative Tone on Idea Generation in Computer-Mediated Groups. While they found little effect on solution quality and solution rarity, they did find that “[g]roups working anonymously and with a critical confederate produced the greatest number of solutions and overall comments” [13]. Conversely, and in line with later findings by Paulus et al. [37], they found that “[i]dentified groups working with a supportive confederate were the most satisfied and had the highest levels of perceived effectiveness, but produced the fewest original solutions and overall comments” [13]. Other research also points toward a suspicion that privacy affects creativity.

Throughout the literature we have seen references to the potential importance of privacy to idea generation in groups. For example, Warr and O’Neill ask, “what are the effects of public, social and private interaction on creativity?” [59], and Connolly et al. found that members of anonymous critical groups performed best in their study [13].

2.7 Summary

As we have seen, group work is an important and long-studied area of inquiry. Its complexities have been studied from many different disciplines, with their varying epistemologies and methodologies. Because of the amount of group work that is done, and the results of these endeavors, we continue to study it. We have also seen that one task that is popular to attempt in groups is idea generation, and that a popular method for generating ideas in groups is brainstorming, despite questions about its efficacy. Some work in Participatory Design, such as that on Design Games, aims to address some of the challenges of being productive generating ideas in groups. Privacy, relevant to group work, creativity, and more, has taken on new meaning and relevance in this era of ubiquitous mobile technologies.

My interest in the state of these fields and in the pursuit of investigating micro-coordination phenomena led me to design and run a study seeking to incorporate and eventually extend this knowledge.


Chapter 3

Group Design Study

3.1 Introduction

The Design Game Study was designed to investigate a group design task mediated by a computer-supported cooperative work (CSCW) application. CSCW applications are deployed in many configurations, but my study centered on investigating the situation in which participants were collocated, each using a separate laptop. The question under consideration was whether and how different degrees of information sharing changed the generativity and experience of the group and individuals in the group. On the one hand, group work is often thought to be more productive than solitary work. On the other, a long tradition of inquiry suggests that group performance can suffer under certain circumstances. Brainstorming is a commonly employed strategy for group work. Design firms frequently use structured brainstorming techniques in idea generation, often with support from props. More modern investigations of group processes suggest that some practices produce better brainstorming than others. However, no one has investigated whether configurations of private as compared to public screen space interact with group or individual productivity and experience.

3.2 Construction of the Situation

My study utilizes an interpersonal situation described by Brandt and Messeter in which participants are asked to generate design ideas by picking relevant pictures from a heap, putting them on a playing board in juxtaposition to other people’s pictures, and telling a story about design qualities anchored by associations with the new pictures [7]. Their “User Game” was designed to be played without benefit of technology by a group of would-be designers in the same room and was presented for its promotion of participatory design practices.




Figure 3.1: The starting view of the user game software.

I created two variations of the User Game as software running on laptop computers. In both versions, the window was vertically divided into two regions (as in Figure 3.1). The top half of the window constituted the “playing area” and the bottom half was the “staging area.” The playing area constituted a shared resource in that any changes made to it by any participant were immediately broadcast to all other participants. Additionally, any participant could reposition any picture. In the Shared Planning condition, sharing in the staging area worked the same as in the playing area. That is, any changes any participant made in the position of pictures within the staging area were immediately shared to all other participants. All participants could move any picture at any time. However, the Private Planning condition worked differently. Changes that a participant made within the staging area were visible only to the person who made the change, on his/her local machine. In this sense, the staging area and its resources (the pictures and their arrangements) were private.
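To make the difference between the two conditions concrete, the following minimal Java sketch expresses the sharing rule just described. It is illustrative only: the names (Region, Condition, SharingPolicy) are hypothetical and are not drawn from the actual implementation, which is described in 3.6.

```java
// Hypothetical sketch of the sharing rule that distinguishes the two conditions.
// These names are illustrative, not the thesis software's actual API.
enum Region { PLAYING_AREA, STAGING_AREA }
enum Condition { SHARED_PLANNING, PRIVATE_PLANNING }

class MoveEvent {
    final String pictureId;
    final Region region;
    final int x, y;

    MoveEvent(String pictureId, Region region, int x, int y) {
        this.pictureId = pictureId;
        this.region = region;
        this.x = x;
        this.y = y;
    }
}

class SharingPolicy {
    private final Condition condition;

    SharingPolicy(Condition condition) { this.condition = condition; }

    /** Moves in the playing area are always shared with the group; moves in the
     *  staging area are shared only in the Shared Planning condition. */
    boolean isSharedWithGroup(MoveEvent move) {
        if (move.region == Region.PLAYING_AREA) {
            return true;
        }
        return condition == Condition.SHARED_PLANNING;
    }
}
```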

I asked participants to play the game with each other via individual laptop. So in each session there were three participants, and I asked them each to sit in one of three chairs around a circular table. At each of these seats there was a laptop. The game was maximized on their screen. At the beginning of the game, all of the images were in the staging area. All of the participants could move the images around the staging area by clicking and dragging. In addition, they could drag images from the staging area to the playing area, or back to the staging area from the playing area.



In an effort to closely model my software version of the user game after Brandt and Messeter’s physical (paper) version, the images in the game were treated as “real” photos, in that there was only one copy. So while this copy was visible on all three screens, once a player moved the image to the playing area, it was no longer available in anyone’s staging area (but instead now present in everyone’s playing area).
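This single-copy behavior can likewise be sketched, assuming a shared model that records exactly one authoritative location per picture; the class and method names below are hypothetical, not those of the actual software.

```java
// A minimal sketch of the single-copy semantics, assuming a shared model that
// records exactly one authoritative location per picture. All names here are
// illustrative, not the thesis's actual classes.
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

class SharedBoardModel {
    enum Area { STAGING, PLAYING }

    // pictureId -> current area; one entry per picture, so a picture can never
    // appear in the staging area and the playing area at the same time.
    private final Map<String, Area> locations = new ConcurrentHashMap<>();

    void addToStagingArea(String pictureId) {
        locations.put(pictureId, Area.STAGING);
    }

    /** Moving a picture to the playing area rewrites its single location record,
     *  so it disappears from every participant's staging area at once. */
    void moveToPlayingArea(String pictureId) {
        locations.put(pictureId, Area.PLAYING);
    }

    Area areaOf(String pictureId) {
        return locations.get(pictureId);
    }
}
```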

3.3 Study Design

The study contrasted two conditions, one in which participants had a private space (the Private Planning condition) and one in which all spaces were shared (the Shared Planning condition). Groups of three engaged in a structured brainstorming activity in one of these two conditions. Half the groups used one set of prompting pictures associated with the request to design a campus dining hall. The other half used a different set of prompting pictures associated with the request to design a campus study lounge. This “transparency,” deciding whether the staging area was private or shared, is my independent variable.

3.4 Description of Procedures

Participants met the researchers in the green room, where they made acquaintance and discussed the study and informed consent. After completing the informed consent process, the participants entered the study room. The study room was a small room approximately ten feet square. One wall of the study room was a half-silvered window, beyond which, participants were told (honestly), there was a camera positioned to record the study from a wider angle than was possible from within the study room. In the center of the room there was a circular table with three fifteen-inch laptops. The laptops were spaced equally around the circumference of the table. At each laptop position, there was a chair for one participant. In one corner of the room there was a small desk where the researcher sat during the study. Around the edge of the room there were three cameras, each directed at the position at the table farthest from it. The general layout of the study room is illustrated in Figure 3.2.

Upon entering, the participants were asked to select a chair at the table in which to sit for the duration of the study. Next the researcher explained the user game and its interface to the participants. The participants were instructed as a group as they sat around the table together in the room. Half of the sessions were told that they would be designing a dining hall (for a college campus), while the other half of the sessions were told that they should design a campus study lounge. The participants were told that they would be taking turns. Next the researcher explained that on the first person’s first turn, they would select five images from the heap in the staging area at the bottom of their screen that they felt told a story about a person who they imagined would use their venue (dining hall or study lounge). The researcher next explained that after the first person’s first turn, all the turns were structured in the same way, but slightly differently from the first person’s first turn.




Figure 3.2: Layout of the Study Room: depicts the three positions of the participants and their laptop computers around the table (A-C), the cameras aimed at them, and the position of the researcher (R).



Figure 3.3: The view of the user game software while a session is underway.

On these subsequent turns, the participant would select two to four images from the heap in the staging area, and they would choose them such that they felt they told a story about a person they imagined would use their venue. However, on these subsequent turns, the participant would need to add the images they chose to an image that had been placed in the playing area on someone’s previous turn. Additionally, the participant was asked to include that existing image in their new story (see Figure 3.3 for an image of the software during play). The researcher also explained that before they indicated the end of their turn to the other participants, they should explain or tell the story of the pictures they chose. The participants were told to share their story either as they were dragging their images to the playing area, or after they finished moving the pictures for their turn.

The researcher explained that the participants need not exhaust their supply of images from the staging area in creating their stories. Instead, they should decide when they feel nothing new would be contributed by the remaining pictures, and then seek consensus from the rest of the participants. Having reached consensus that they wished to add no further images, they were to tell the researcher they were done. Finally, the researcher told the participants that they could ask the researcher questions at any point. For reference, I have included the guidelines given to the researchers for how to instruct the participants (see Appendix B). In order to assist the researcher in describing how to use the interface, I created a prop (an illustration of the interface on a piece of poster board) similar to Figure 3.8.
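Purely for illustration, the turn structure described above can be summarized as a small set of rules. The Java sketch below encodes them; the names are hypothetical, and I do not claim that the software enforced these rules (they were conveyed to participants by the researcher’s verbal instructions).

```java
// Illustrative encoding of the turn rules described above. This is a sketch only;
// the thesis does not state that the software enforced these rules, and all names
// here are hypothetical.
class TurnRules {
    static final int FIRST_TURN_IMAGES = 5;
    static final int MIN_SUBSEQUENT_IMAGES = 2;
    static final int MAX_SUBSEQUENT_IMAGES = 4;

    /** The first person's first turn places exactly five images from the staging area. */
    static boolean isValidFirstTurn(int imagesPlaced) {
        return imagesPlaced == FIRST_TURN_IMAGES;
    }

    /** Subsequent turns place two to four images and must attach to an image
     *  already in the playing area from a previous turn. */
    static boolean isValidSubsequentTurn(int imagesPlaced, boolean attachesToExistingImage) {
        return imagesPlaced >= MIN_SUBSEQUENT_IMAGES
                && imagesPlaced <= MAX_SUBSEQUENT_IMAGES
                && attachesToExistingImage;
    }
}
```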



3.5 Data Collection

I collected data by a number of methods and media. With the help of my undergraduate research assistants, I recorded each seat at the table from a separate camera (as depicted in Figure 3.2), and recorded all three from a camera positioned in the adjacent room for a wider angle. I also recorded screen captures from each computer using CamStudio [10].

In addition, at the end of the sessions, participants completed a questionnaire. I administered this post-session questionnaire through a Virginia Tech Information Technology Security-approved service named Qualtrics Surveys [43]. The majority of the post-session questionnaire was composed of several questions in the form of choosing a position on a Likert scale [31]. In composing the questions for this questionnaire, I was interested in topics relating to four themes: the participants’ perceived ease of generating ideas, their perception of the quality of ideas, participants’ experience of social inhibition during the idea generation process, and their perception of the amount and quality of the collaboration with the others. These themes are evident in the text of this questionnaire, which can be found in Appendix A.

3.5.1 Data Management

The screen captures were stored for some time on the study laptop on which they were recorded. These laptops were password protected, and stored in the study room, which was locked whenever the researchers were not present. The video from the cameras was recorded to nNovia digital video recorders (one for each of the four cameras) [15]. These recorders are essentially hard drives. At the end of the day, these hard drives were taken to our lab at 1117 KnowledgeWorks 2 in the Virginia Tech Corporate Research Center [57]. Our lab is locked at all times. In order to free capacity on the nNovia boxes at the end of a day, prior to the next day of studies, I transferred the video data from the nNovia boxes to two external hard drives (which were kept locked in a file cabinet in the lab). One of these external drives was treated as a mirror of the other. On these drives, I created a TrueCrypt file container [55] to store the data in an encrypted location.

3.6 Technologies

Our versions of the User Game are implemented in Java [50]. In order to manage shared resources, I used an implementation of TupleSpaces [26]. To build the custom user interface, I used Java’s Swing API [34] to render the two regions of the screen and the draggable images. The software was built to run on laptops running the Windows XP Professional operating system with Service Pack 3 installed. Additionally, each laptop had the Java 2 Runtime Environment version 1.4.2 (JRE 1.4.2).



Figure 3.4: Self-Reported Age of Participants

Years of Age    Number of Participants
18              18
19              25
20              11
21              9
22              6

Table 3.1: Self-Reported Age of Participants

As screen real estate was important to our software, and has been shown to affect individuals’ attention and arousal [45], I used laptops with large, fifteen-inch monitors and a relatively high resolution of 1440 by 900 pixels. For reference, the source code for my implementations of the user game is available online [58].
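As an illustration of the kind of interface code this description implies, the sketch below shows a draggable picture built with Swing’s MouseAdapter. It is a minimal, assumed example rather than an excerpt of the actual source: the class name is hypothetical, the parent container would need a null layout for setLocation to take effect, and the comment about writing positions to the TupleSpace merely gestures at where sharing would hook in.

```java
// A minimal Swing sketch of a draggable picture inside a panel with a null layout.
// All names here are illustrative; the thesis's actual classes and the TupleSpaces
// wiring are not shown.
import javax.swing.*;
import java.awt.*;
import java.awt.event.*;

class DraggablePicture extends JLabel {
    private Point dragOffset;

    DraggablePicture(Icon picture) {
        super(picture);
        setSize(getPreferredSize());
        MouseAdapter dragHandler = new MouseAdapter() {
            @Override
            public void mousePressed(MouseEvent e) {
                dragOffset = e.getPoint(); // remember where inside the picture we grabbed
            }
            @Override
            public void mouseDragged(MouseEvent e) {
                // Reposition the picture within its parent as the mouse moves.
                Point parentPoint = SwingUtilities.convertPoint(
                        DraggablePicture.this, e.getPoint(), getParent());
                setLocation(parentPoint.x - dragOffset.x, parentPoint.y - dragOffset.y);
                // In the real system, the new position would also be written to the
                // shared space here (always for the playing area, and for the staging
                // area only under the Shared Planning condition).
            }
        };
        addMouseListener(dragHandler);
        addMouseMotionListener(dragHandler);
    }
}
```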

3.7 Description of Population

The participants in the study were undergraduate students from Virginia Tech. They were enrolled in a Psychology class during the semester of the study. I recruited a total of sixty-nine participants. The participants were between eighteen and twenty-two years of age, with a mean age of 19.4. I offered the participants the choices of “first year,” “sophomore,” “junior,” “senior,” or “graduate” for their academic year. Thirty-seven reported themselves to be in their first year. Participants were able to report their gender as female, male, or other. Thirty-one participants reported their gender as female, and thirty-eight reported it as male (none reported a gender other than female or male).



Figure 3.5: Self-Reported Academic Year of Participants

Figure 3.6: Self-Reported Gender of Participants

Academic Year    Number of Participants
First year       37
Sophomore        14
Junior           8
Senior           10
Graduate         0

Table 3.2: Self-Reported Academic Year of Participants



Figure 3.7: Self-Reported Major of Participants

Gender    Number of Participants
Female    31
Male      38
Other     0

Table 3.3: Self-Reported Gender of Participants



3.7.1 Recruitment

Virginia Tech's Psychology Department has a subject pool management system called "SONA" (after the company that provides the service, Sona Systems) through which students can elect to participate in studies ongoing at the University. I recruited participants through this SONA system.

3.7.1.1 Incentives

Students in the SONA system receive extra credit in a Psychology class in which they are enrolled for participating in a certain number of hours of studies. The SONA system can award a variable number of credits to students who participate in studies. The Psychology Department asks that researchers who wish to utilize the SONA system predict the amount of time necessary to complete participation, rounding any fraction of an hour up to the nearest hour. Due to the time necessary to receive the three participants, provide instruction, explain the interface, perform the task, and finally answer a questionnaire, I predicted that it might take up to ninety minutes for participants to complete participation. Thus, in SONA, participants could see that they would receive two credits for their participation in my study.


Figure 3.8: Prop to assist the researcher in explaining the User Game software to the participants

Figure 3.9: Participants playing the User Game


Figure 3.10: Participants playing the User Game


Chapter 4

Results

For this study I had the good fortune of having sixty-nine subjects participate (three at a time) in a total of twenty-three sessions. The large number of sessions, combined with the numerous methods of data collection, resulted in an abundance of data. The results came from my data analysis, which can be divided into two phases.

4.1 Data Analysis

I used several tools to analyze my data, among them Microsoft Excel [32], JMP [46], and R [44]. The first phase of my data analysis process was to collect the post-study questionnaire results from Qualtrics [43]. I first compiled descriptive statistics describing some further demographics of my participants. Next, I compiled descriptive statistics portraying the distribution of responses on the Likert questions in the questionnaire. Then, I used inferential statistical methods to test the Likert data in search of variances that might be attributable to my independent variable: transparency.

After the post-study questionnaire results, I analyzed the video data collected from the sessions. The analysis of this video data was relatively complex and is described at length in Section 4.1.2.

4.1.1 Post-Session Questionnaire Results

I designed the post-study questionnaire both to gather demographic information about the participants (see Section 4.1.1.1) and to collect information about the participants' experiences interacting with each other and with our software implementation of the User Game (see Section 4.1.1.2). This questionnaire is included in Appendix A for reference.


Figure 4.1: Sixty-two out of sixty-nine participants reported English as their first language.

4.1.1.1 Demographics

Some of the responses from these questionnaires have already been reported above in Tables 3.1, 3.2, and 3.3. However, I also collected some other demographic data that I felt may be relevant to the study. Because participants would be conversing and telling narratives corresponding to their pictures, I thought it prudent to report whether English was their first language. Additionally, their shared cultural knowledge was relevant to their conversations, so I also asked participants their ethnicities. As shown in Figures 4.1, 4.2a, and 4.2b, there was little diversity in our participants, with an overwhelming majority of them speaking English as their first language, being Caucasian, and being from the United States of America.

4.1.1.2 User Experiences

In the post-session questionnaire, the participants responded to up to twenty-six Likert questions covering the four themes mentioned in Section 3.5: Ease, Quality, Social Inhibition, and Collaboration.

I calculated the average responses for each question from the three participants in each session. I then performed an Analysis of Variance for each question to see whether the level of transparency predicted the response of the group on that question. Out of twenty-six questions, I found one to have a statistically significant variance and two that were marginally significant. In Tables 4.1, 4.2, 4.3, and 4.4, I present the means, by condition, for all of these questions, grouped into separate tables according to the primary theme addressed.
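
As a concrete illustration of this step, the sketch below averages within-session responses and computes a one-way ANOVA F statistic for the transparency factor on one question. It is only an illustration of the procedure, not the analysis that was actually run in JMP [46] and R [44]; the numbers are invented placeholders, and the p-value would still need to be read from an F distribution.

// Sketch of the per-question analysis: each group holds session means (the mean
// of the three participants' Likert responses in that session); the F statistic
// tests whether the transparency factor predicts those means. Illustrative only.
import java.util.Arrays;

public class TransparencyAnova {

    /** One-way ANOVA F statistic for two (or more) groups of session means. */
    static double oneWayAnovaF(double[][] groups) {
        int k = groups.length, n = 0;
        double grandSum = 0;
        for (double[] g : groups) {
            n += g.length;
            for (double x : g) grandSum += x;
        }
        double grandMean = grandSum / n;

        double ssBetween = 0, ssWithin = 0;
        for (double[] g : groups) {
            double m = Arrays.stream(g).average().orElse(Double.NaN);
            ssBetween += g.length * (m - grandMean) * (m - grandMean);
            for (double x : g) ssWithin += (x - m) * (x - m);
        }
        double msBetween = ssBetween / (k - 1);   // df between = k - 1
        double msWithin = ssWithin / (n - k);     // df within  = n - k
        return msBetween / msWithin;
    }

    public static void main(String[] args) {
        // Invented session means for one question (each value = mean of 3 participants).
        double[] moreTransparent = {6.2, 5.9, 6.3, 6.0, 5.8};
        double[] lessTransparent = {7.1, 6.8, 7.4, 6.9, 7.2};
        double f = oneWayAnovaF(new double[][] {moreTransparent, lessTransparent});
        System.out.printf("F(1, %d) = %.2f%n",
                moreTransparent.length + lessTransparent.length - 2, f);
        // The corresponding p-value would be looked up in an F distribution
        // (e.g., with a statistics library); it is not computed here.
    }
}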

For the question "On their turn, to what extent did others build on your ideas?" I found that the response of the participants was correlated with transparency, t(3) = −2.13, p < 0.05.


(a) Self-reported nationality of participants.

(b) Self-reported ethnicity of participants.

Figure 4.2: Nationality and ethnicity of participants.


Figure 4.3: Means by transparency (condition) and card deck (dining vs. study) for "On their turn, to what extent did others build on your ideas?"

I graphed the means for this question in Figure 4.3. Participants in the Less Transparency condition felt that their groupmates built on their ideas to a greater extent than did participants in the More Transparency condition. That is, participants were more likely to feel that their colleagues built on their ideas when they could not see each other's staging area.

Table 4.1: Means for Post-Session Self-Report Questions Concerned with Ease of Idea Generation

Question | More Transparent, Dining | More Transparent, Study | Less Transparent, Dining | Less Transparent, Study
How easily were you able to come up with ideas? | 7.00 (0.894) | 7.67 (0.972) | 6.80 (1.52) | 6.67 (0.745)
How hard was it to come up with ideas that you were willing to share? | 2.83 (1.22) | 3.60 (1.64) | 3.20 (1.35) | 3.62 (1.16)
How invited did you feel to pursue your associations? | 6.89 (0.807) | 7.13 (0.691) | 7.13 (0.989) | 6.10 (1.52)

Table 4.2: Means for Post-Session Self-Report Questions Concerned with Social Inhibition

Question | More Transparent, Dining | More Transparent, Study | Less Transparent, Dining | Less Transparent, Study
Whether or not you actually shared it, how willing were you to share the first thing you thought of each turn? | 7.61 (1.04) | 8.07 (0.548) | 7.73 (0.760) | 6.95 (0.803)
How comfortable were you when sharing your ideas? | 7.61 (1.27) | 7.93 (0.548) | 7.73 (1.32) | 7.14 (0.573)
How comfortable were you when building on someone else's idea? | 7.44 (0.886) | 7.73 (0.365) | 7.80 (1.02) | 1.29 (0.731)
On your turn, how often did you share the first thing you had thought of? | 6.94 (1.14) | 7.47 (0.730) | 7.67 (0.471) | 6.90 (0.763)
How free did you feel to pursue your own associations? | 7.00 (0.869) | 7.47 (1.07) | 7.20 (1.15) | 6.86 (1.03)
How constrained did you feel from sharing your own thoughts? | 2.44 (0.886) | 2.40 (0.894) | 2.07 (0.723) | 3.10 (1.33)

Table 4.3: Means for Post-Session Self-Report Questions Concerned with Quality

Question | More Transparent, Dining | More Transparent, Study | Less Transparent, Dining | Less Transparent, Study
How often did other people's ideas capture your imagination? | 6.33 (0.894) | 6.00 (1.11) | 6.80 (1.04) | 7.19 (0.790)
How frequently did someone else's idea give you an idea? | 5.78 (0.689) | 6.20 (1.32) | 6.13 (1.82) | 6.86 (0.325)
How many of your ideas resulted in a negative reaction from your group? | 1.78 (1.05) | 1.47 (0.183) | 1.13 (0.183) | 1.86 (1.14)
How many of your ideas resulted in a positive reaction from your group? | 6.28 (1.25) | 6.73 (1.42) | 6.53 (1.48) | 6.24 (0.568)
How much did you care about doing a good job personally? | 6.89 (0.544) | 7.20 (0.989) | 6.93 (0.830) | 6.29 (1.30)
How much did you care about your group doing a good job? | 6.06 (0.712) | 7.00 (1.03) | 6.40 (2.01) | 6.00 (1.20)

Table 4.4: Means for Post-Session Self-Report Questions Concerned with Collaboration

Question | More Transparent, Dining | More Transparent, Study | Less Transparent, Dining | Less Transparent, Study
† On your turn, how often did you build on other people's ideas out loud? | 6.50 (1.33) | 6.53 (0.869) | 7.53 (1.22) | 7.19 (0.813)
† How often did other people's ideas capture your imagination? | 6.33 (0.894) | 6.00 (1.11) | 6.80 (1.04) | 7.19 (0.790)
* On their turn, to what extent did others build on your ideas? | 6.17 (1.03) | 5.93 (1.30) | 7.47 (1.28) | 6.67 (0.981)
How often did someone share an idea you were planning to share before your turn came to share? | 1.72 (1.58) | 3.20 (1.73) | 2.80 (1.98) | 2.33 (1.78)
How frequently did someone else's idea give you an idea? | 5.78 (0.689) | 6.20 (1.32) | 6.13 (1.82) | 6.86 (0.325)
How comfortable were you when building on someone else's idea? | 7.44 (0.886) | 7.73 (0.365) | 7.80 (1.02) | 1.29 (0.731)
How often did your group react strongly to your ideas? | 5.44 (1.49) | 5.73 (1.61) | 4.60 (0.863) | 5.29 (0.356)
How many of your ideas resulted in a negative reaction from your group? | 1.78 (1.05) | 1.47 (0.183) | 1.13 (0.183) | 1.86 (1.14)
How many of your ideas resulted in a positive reaction from your group? | 6.28 (1.25) | 6.73 (1.42) | 6.53 (1.48) | 6.24 (0.568)
How influenced do you think you were by others? | 5.22 (0.886) | 5.80 (1.02) | 6.33 (1.27) | 6.00 (0.638)
How often did you react strongly to something someone else said? | 3.44 (0.807) | 4.67 (2.01) | 4.67 (1.81) | 5.43 (1.08)
To how many of your group's ideas did you react negatively? | 2.00 (0.869) | 1.60 (0.548) | 1.07 (0.149) | 1.76 (0.418)
To how many of your group's ideas did you react positively? | 7.17 (0.691) | 7.20 (0.803) | 7.20 (1.61) | 6.90 (1.44)
How often did you express your strong reaction to someone else's idea in words? | 3.39 (1.54) | 4.20 (1.61) | 3.87 (1.61) | 4.19 (1.17)
How much did you care about the other people in your group? | 5.56 (0.584) | 5.73 (1.86) | 5.73 (1.23) | 5.05 (1.51)
How invited did you feel to pursue others' associations? | 6.33 (1.01) | 6.80 (0.447) | 6.87 (0.767) | 6.24 (0.810)

* Denotes statistical significance (p < 0.050). † Denotes statistically marginal significance (p < 0.080).

The two questions with marginally significant results were "How often did other people's ideas capture your imagination?", correlated with transparency with t(3) = −1.88, p < 0.076, and "On your turn, how often did you build on other people's ideas out loud?", with t(3) = −2.08, p < 0.0513. The means for these are graphed in Figures 4.4 and 4.5, respectively. To interpret these results, note that participants in the Less Transparency condition were more likely to have their imagination captured by their colleagues. Participants in the Less Transparency condition were also more likely to build on their colleagues' ideas out loud than participants in the More Transparency condition.

From these three questions we see that, at least in their self-reports, participants in the Less Transparency condition were more likely to engage with their colleagues' contributions and to perceive their colleagues as collaborating with them. These data are consistent with each other in suggesting that the Less Transparent condition better facilitates idea generation in groups.


Figure 4.4: Means by transparency (condition) and card deck (dining vs. study) for "How often did other people's ideas capture your imagination?"


Figure 4.5: Means by transparency (condition) and card deck (dining vs. study) for "On your turn, how often did you build on other people's ideas out loud?"



4.1.2 Video, Transcripts, and Codes

As mentioned in the Description of Procedures (Section 3.4) and depicted in Figure 3.2, I recorded video of each of the participants and, additionally, a wider angle through the window from an adjacent room. With the help of my undergraduate research assistants and interns, I transcribed the videos from the study sessions, writing what each participant said for the duration of the session. I used a specialized transcription software product, InqScribe, to facilitate the transcription process.

Having transcribed all of the videos, I noticed that there were several kinds of contributions that the participants made to the design. Overwhelmingly, participant contributions focused on the features to be added to the study lounge (or dining hall, depending on which set of "cards" was given for their session). I developed an initial coding key with six categories based on my observations. Three coders (including me) divided the transcripts, each working on a subset of the total. I trained the two undergraduate coders by sitting with them while they coded an initial transcript. Early on, I reviewed completed coded transcripts and discussed differences until convinced that the codes were applied correctly and consistently.

The failure to obtain consistent, agreed-upon codes in some areas led to the creation of an "Other Features" category. However, three different kinds of features emerged with satisfactory consistency: aesthetic, setting, and value. Aesthetic was used to label any contribution that was wholly directed at the sense of beauty or taste of the potential user of the space being designed. These could include features that were visually, aurally, or gustatorily pleasing, for example. Setting was used to characterize contributions that focused on the situational, temporal, or social setting for a feature, or a feature's other preconditions. That is, a contribution was coded as having setting characteristics if it described the context either of a feature or of the environment that the person was trying to explain. There were also times when a participant indicated a feature that communicated a value held by the participant-designer or credited by the participant-designer to their imagined users; we coded these instances as Value. Other codes used during the coding process were not applied with sufficient consistency and were therefore consolidated into the more general Other Features category. In Table 4.5 I have compiled a few examples of each of these.

One question I had was whether the experimental condition had any effect on the number of contributions individuals made to the design. To study this question, I examined the number of contributions made by the individual participants by the condition in which they participated. I tested the variance of the number of contributions made by each group against their condition. That is, I performed an Analysis of Variance to test whether the level of transparency or the card deck used for a given session could predict the number of contributions.


Code | Example 1 | Example 2 | Example 3
Aesthetic | "...homey accent..." | "...a nice statue..." | "Gotta appreciate your art..."
Setting | "Oh, it's the end of my day, I'm going to go get something to eat..." | "...you're walking outside, ok, and you're hungry so..." | "I'm walking to class, around Squires and my next class doesn't start for 30 minutes..."
Value | "it's always good to take your trash away because dining halls are much more enjoyable when they're clean." | "...but his girlfriend came into the dining center, and he remembered she was a vegetarian, and he wasn't supposed to eat meat around her, so he went back to the salad bar to get a salad..." | "I don't really like working while I'm eating..."
Feature | "...nice places for people to sit..." | "a nice view so that when they want to look outside during a break or something they have something to look at." | "They need lockers so they can lock up their stuff."

Table 4.5: Textual examples from the transcripts for each of the codes.


Figure 4.6: Mean number of contributions of each condition, by card deck.

The ANOVA indicated only marginal evidence that the level of transparency predicted the number of contributions, F(3, 19) = 1.48, p < 0.0773. In Figure 4.6, I have plotted the mean number of contributions for each condition, by card deck. These data reveal that participants in the More Transparent condition made more contributions than those in the Less Transparent condition.


Chapter 5

Conclusion

5.1 Discussion

The Group Design Study investigated the effects of differences in transparency on idea generation in groups. I implemented Brandt and Messeter's User Game [7] in software. I created two versions of the software: one in which the staging area was shared among all of the participants (for the More Transparent condition), and the other in which each participant had a private staging area (for the Less Transparent condition). I then asked three participants (per study session) to play the User Game using our software to design either a Dining Hall or a Study Lounge.

From the prior work, I expected the different degrees of privacy, implemented through degrees of transparency, to have an effect on the groups. I also expected the rules of the User Game to facilitate the participants in generating ideas. By using a turn-taking mechanism, I alternated the participants' activity between direct engagement with the group and potentially more individual work. Additionally, the structure of the game outlines expectations for the types of interpersonal interactions in which the participants would engage. The different degrees of transparency also affect the participants' "means or instrumentality to coordinate and adjust the activity to meet their needs" [18], or coordinative agency. Therefore, I had further reason to expect an impact on the participants' group task.

I found that participants reported symptoms of greater collaboration in the Less Transparent condition. However, the number of contributions made by the groups was lower in the Less Transparent condition. So while participants felt that they were engaging with each other's contributions more in the Less Transparent condition, they made fewer total contributions. Also interesting is that in the condition in which participants arguably had less coordinative agency, they reported more engagement with each other's ideas. Having less transparency may have given the participants some freedom from criticism (during their time planning their contribution), and this may have resulted in the increased interaction between the participants and each other's ideas. Another facet of the explanation may be that the joint presentation of speech in the Less Transparent condition created a focused moment for response and elaboration that was missing in the More Transparent condition.



5.1.1 Group Work, Brainstorming, and Creativity

From the prior work discussed above, I would expect that the less transparent condition, which arguably resembles individual work more closely, would have resulted in more contributions to the design [12, 19, 20, 21, 23, 29, 33, 38, 52]. However, I found that in the less transparent condition, fewer contributions were made. Perhaps, as hypothesized in the Electronic Brainstorming and some of the Participatory Design literature, our particular combination of turn-taking, transparency, and design games managed to mitigate the performance-degrading effects of group work. However, this raises the thorny question of quality of contribution, which I have not addressed.

5.1.2 Privacy

In the Less Transparent condition, because participants could see each other's actions only on the board, and not in the stage, there was arguably more privacy. In this more private condition, wherein I expected participants to have a greater feeling of solitude, I would have expected them to make more contributions than in the less private condition [41]. However, our video coding analysis shows that in fact the participants in the less private condition contributed more than those in the more private condition, which seems to contradict the prior work [41, 40].

To explain this seeming contradiction, I might consider the term "solitude." Pedersen defines solitude as "freedom from observation by others" [40]. In the less transparent condition of our study, participants had some "freedom from observation by others" with respect to their actions in the staging area. Because the participants were all seated around a round table together, it cannot be argued that they were completely free from observation. However, their specific actions in the staging area could not be seen by the other participants. While it is possible that the other participants noticed the movements of their co-participants' hands and mice on the table, they could not have correlated these movements with the arrangement of specific images in the game. Did the participants in the Design Game Study have solitude? Can it be said that they had solitude with respect to their actions on the board? What are the bounds of solitude as used by Pedersen?

In questioning "solitude" and how "transparent" or "private" the conditions in the study were, there is a question of how exactly privacy works. How narrowly defined is "privacy"? Is "private" the opposite of "public"? Are there only two buckets? Is there a spectrum? Is it a spectrum from most transparent (least private) to least transparent (most private)? If it is indeed a spectrum, what range of the spectrum qualifies as private? As public? Are transparency and privacy just coincident-but-opposite valences of the same spectrum? While a lack of transparency certainly helps maintain or create privacy, I think that degrees of transparency are insufficient to describe privacy.


Figure 5.1: Degrees of transparency between conditions. On the left: the less transparent condition; on the right: the more transparent condition. Red represents little or no transparency, yellow indicates some transparency, and green denotes most transparency.


Something that is completely transparent but is never discovered is not public. So privacy requires both a level of secrecy, unpopularity, or unknown-ness, and sufficient opacity (lack of transparency) to maintain that secrecy.

5.1.3 Coordination

In the study, the coordinative agency was varied by the transparency of the condition. One aspect of coordination would involve participants attending to each other's actions in the application. From this standpoint, in our less transparent condition, participants could not see each other's actions in the staging area, but they would usually see each other's actions on the board (because the participant whose turn it was would have been narrating a story based on their new cards). In the more transparent condition, it is possible that participants would have attended to others' actions in the staging area, as these actions were shared, but there was no guarantee that this would occur. For example, if a participant moved a picture in a certain region of the staging area, but either of the other participants was scrolled away from that region, they would not have noticed the change. Alternatively, if a participant had uncovered a buried tile and moved it somewhere, but the other participants had not unburied that tile, it is possible they would not notice its relocation (Figure 5.1 diagrams these conditions).


With regard to coordination, from a micro-coordination perspective, I can say that the more transparent condition had more coordinative agency as it enabled participants to see each other's actions in the stage. However, I interpreted the results of three of the self-response questions to mean that participants felt that they collaborated less in the condition with more transparency. This contradiction invites further investigation.

5.1.4 Summary

Following a long tradition of research, I constructed a study of a group idea generation task performed in a CSCW application. Breaking from this tradition, I tested the ability to influence behavior simply by changing the nature of privacy supported by the application. The application was an implementation of Brandt and Messeter's User Game, in which participants take turns adding images to the playing area that are intended to tell a story of the users of a particular design [7]. With this small change to the interface, I was able to influence the nature of the collaboration among the participants. In varying the transparency of the interface, I observed a trade-off between the number of contributions a group produced and their perception of building on each other. With more transparency, there were more contributions, but the participants' perception of their group members' tendency to build on their ideas was diminished.

Following from these results, I suggest that future designs of CSCW applications should optimize the transparency of their interface to suit the intended task. Rather than following the intuitive but oversimplified tendency to make all group members' actions and application state available to everyone in real time, designers might use some amount of privacy to preserve or create "coordinative agency" [18] for their users.

5.1.4.1 Limitations

The current work centered on an idea generation task performed by groups with collocated members. While I cannot make strong claims about the effect of transparency on productivity, I believe that other collaborative tasks would be mediated similarly to the task in the current study, in that participants would report that their groupmates built on their efforts more when there was diminished transparency.

Due to limitations in the implementation of the CSCW application used for the study, participants were unable to see their groupmates' images while they were being dragged. In the more transparent condition, they would see an image placed in its new location as soon as their groupmate dropped it, but while their groupmate dragged it, it would have appeared to remain in its original location. Despite this limitation, I observed an effect of the transparency on the collaboration of the group members. This suggests that the effect might be amplified if participants' images had been visible to their groupmates while in motion.


The turn-taking structure of the task created an additional limitation. The turns were part of the original Brandt and Messeter (User Game) task [7]; however, they appear to have interacted differently with the different conditions of my study in a way that may have obscured the effects of transparency.

Finally, the task I used for my application was a group design task as in [7]. While Brandt and Messeter were interested in Participatory Design, there are usually actual, trained designers involved, even in Participatory Design methods. Our participant pool comprised undergraduate students taking at least one psychology class that semester; none were actual designers.

5.2 Future Work

In the future, I would like to recruit more participants for this study. The effect size may be too small to detect with so few groups in each condition. While the total number of participants (sixty-nine) is quite high for qualitative in-lab studies in the field of Human-Computer Interaction, it may be low in comparison to the effect size, or to the standards of other fields.

5.2.1 Quality

In the current study the primary forms of data analysis were self-report and video transcription and coding. While the findings make claims about the number of contributions, the current study does not support findings based on quality. In the future, perhaps quality raters could be recruited. These raters would be designers, and could be given the coded transcripts. They would independently rate the quality of the ideas, and then inter-rater reliability tests could be performed.
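
One concrete option, shown in the sketch below, is Cohen's kappa for two raters who independently assign each contribution to a quality category. This is only an illustration of the kind of inter-rater reliability test that could be used; no such ratings were collected in the current study, and the category labels and example data are invented.

// Sketch of one possible inter-rater reliability test: Cohen's kappa for two
// hypothetical quality raters assigning each contribution to a category
// (e.g., "low", "medium", "high"). Illustrative only.
import java.util.HashMap;
import java.util.Map;

public class CohensKappa {

    static double kappa(String[] rater1, String[] rater2) {
        if (rater1.length != rater2.length) {
            throw new IllegalArgumentException("Raters must rate the same items.");
        }
        int n = rater1.length;
        int agreements = 0;
        Map<String, Integer> counts1 = new HashMap<>();
        Map<String, Integer> counts2 = new HashMap<>();
        for (int i = 0; i < n; i++) {
            if (rater1[i].equals(rater2[i])) agreements++;
            counts1.merge(rater1[i], 1, Integer::sum);
            counts2.merge(rater2[i], 1, Integer::sum);
        }
        double observed = (double) agreements / n;
        double expected = 0;                       // chance agreement from the marginals
        for (Map.Entry<String, Integer> e : counts1.entrySet()) {
            int c2 = counts2.getOrDefault(e.getKey(), 0);
            expected += (e.getValue() / (double) n) * (c2 / (double) n);
        }
        return (observed - expected) / (1 - expected);
    }

    public static void main(String[] args) {
        // Invented ratings of eight contributions by two hypothetical raters.
        String[] raterA = {"high", "high", "medium", "low", "medium", "high", "low", "medium"};
        String[] raterB = {"high", "medium", "medium", "low", "medium", "high", "medium", "medium"};
        System.out.printf("Cohen's kappa = %.2f%n", kappa(raterA, raterB));
    }
}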

5.2.2 Analysis of Variance of Interaction of Participants by Condition

The current study includes findings from participants' self-reports about their feelings of how much they built on the ideas of the other participants and vice versa. However, there is no more objective data analysis. The software the participants used in the study logged information to files for later analysis. The logged data consists of information about user interface actions. The salient parts of some of the log statements are provided in Table 5.1 as examples. With a custom parser, or perhaps just a few regular expressions, one could parse this data for the timestamp associated with each event followed by a short description of the event. Using this transformation of the data, it would be possible to have timestamped user actions that would constitute a transcript of the participants' activities in the application. Analyzing this in conjunction with the spoken transcript would permit claims about the participants' specific actions in the task and their potential variance with the study condition.


Example Log Data | Interpretation of the Example Log Data
"mouseMove ... absoluteCursorLocation:Point 1226, 103" | The mouse has moved to position (1226, 103).
"...Moment Card Tuple written: edu.vt.cs.server.MomentCardTuple[ TupleType: "MomentCard", source: "server/imageFileName.JPG", ... space: "stage", x: 〈java.lang.Integer〉 745, y: 〈java.lang.Integer〉 62 ..." | The image "imageFileName" is now located at (745, 62) in the stage.
"...Moment Card Tuple written: edu.vt.cs.server.MomentCardTuple[ TupleType: "MomentCard", source: "server/imageFileName.JPG", ... space: "board", x: 〈java.lang.Integer〉 6600, y: 〈java.lang.Integer〉 0 ..." followed by "...Changing space from 'stagePhantom' to 'board'..." | The image moved from the stage to the board, now at position (6600, 0).

Table 5.1: Examples of Log Data and their Interpretations

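
To illustrate what such a parser might look like, here is a minimal sketch that uses a few regular expressions, modeled on the fragments shown in Table 5.1, to turn log lines into timestamped event descriptions. The timestamp format and the exact field layout are assumptions rather than the real log syntax, so the patterns would need to be adjusted to the actual files.

// Sketch of a log parser along the lines suggested in Section 5.2.2. The line
// format (timestamp prefix, field layout) is an assumption based on Table 5.1.
import java.util.ArrayList;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class UserGameLogParser {

    // Assumed line shape: "<date> <time> <rest of log statement>"
    private static final Pattern LINE =
            Pattern.compile("^(\\S+ \\S+)\\s+(.*)$");
    private static final Pattern MOUSE_MOVE =
            Pattern.compile("mouseMove.*absoluteCursorLocation:Point (\\d+), (\\d+)");
    private static final Pattern CARD_MOVE =
            Pattern.compile("MomentCardTuple.*source: ?\"?server/([^\"\\s,]+)\"?.*space: ?\"(\\w+)\".*x:.*?(\\d+).*y:.*?(\\d+)");

    /** Turns raw log lines into short, timestamped event descriptions. */
    static List<String> describe(List<String> logLines) {
        List<String> events = new ArrayList<>();
        for (String line : logLines) {
            Matcher m = LINE.matcher(line);
            if (!m.matches()) continue;
            String timestamp = m.group(1), body = m.group(2);

            Matcher mouse = MOUSE_MOVE.matcher(body);
            Matcher card = CARD_MOVE.matcher(body);
            if (mouse.find()) {
                events.add(timestamp + "  mouse moved to (" + mouse.group(1) + ", " + mouse.group(2) + ")");
            } else if (card.find()) {
                events.add(timestamp + "  image " + card.group(1) + " placed in " + card.group(2)
                        + " at (" + card.group(3) + ", " + card.group(4) + ")");
            }
        }
        return events;
    }

    public static void main(String[] args) {
        // Invented sample lines following the assumed format.
        List<String> sample = List.of(
                "2012-11-05 14:02:10.114 mouseMove ... absoluteCursorLocation:Point 1226, 103",
                "2012-11-05 14:02:12.950 Moment Card Tuple written: edu.vt.cs.server.MomentCardTuple[ "
                        + "TupleType: \"MomentCard\" , source: \"server/imageFileName.JPG\" , space: \"stage\" , "
                        + "x: <java.lang.Integer> 745 , y: <java.lang.Integer> 62");
        describe(sample).forEach(System.out::println);
    }
}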

5.2.3 Extension of Study with Modified Interface

There was an apparent contradiction in the results of the study in terms of micro-coordination. Contrary to prior work on coordinative agency [18], participants seemed to think that they were less collaborative in the more transparent condition. A possible explanation could be that even though the more transparent condition created the possibility that the participants could see each other's actions in the staging area, there is no proof that they attended to each other's actions there.

Perhaps a future study could examine differences between the less transparent condition used in this study (in which participants' staging areas were private) and a new condition in which participants' actions in the staging area are necessarily seen by the others (see Figure 5.2 for a diagram of this condition in comparison to the two from this study).


Figure 5.2: Degrees of transparency between conditions. The left two conditions are those from the current study; the right-most condition is proposed as future work. Red represents little or no transparency, yellow indicates some transparency, and green denotes most transparency.


Bibliography

[1] Black lab. http://blacklab.cs.vt.edu/.

[2] Center for human-computer interaction at virginia tech. http://hci.vt.edu/.

[3] Virginia tech psychology department. http://www.psyc.vt.edu.

[4] E. Aronson, T. Wilson, and R. Akert. Social psychology: The heart and the mind, 1994.

[5] S. Alinsky. Rules for radicals, 1971.

[6] G. Bishop. Gary bishop in cs at unc-chapel hill. http://www.cs.unc.edu/~gb/.

[7] E. Brandt and J. Messeter. Facilitating collaboration through design games. In Proceedings of the eighth conference on Participatory design: Artful integration: interweaving media, materials and practices - Volume 1, PDC 04, pages 121–131, New York, NY, USA, 2004. ACM.

[8] F. Brooks. The mythical man-month, volume 79. Addison-Wesley Reading, Mass, 1975.

[9] V. Brown, M. Tumeo, T. Larey, and P. Paulus. Modeling cognitive interactions during group brainstorming. Small group research, 29(4):495–526, 1998.

[10] Camstudio. Camstudio screen recording software. http://camstudio.org/, November 2012.

[11] P. Carew and L. Stapleton. Towards a privacy framework for information systems development. Information Systems Development, pages 77–88, 2005.

[12] D. Cohen, J. Whitmyre, and W. Funk. Effect of group cohesiveness and training upon creative thinking. Journal of Applied Psychology, 44(5):319–322, 1960.

[13] T. Connolly, L. Jessup, and J. Valacich. Effects of anonymity and evaluative tone on idea generation in computer-mediated groups. Management science, 36(6):689–703, 1990.


[14] N. Cottrell, D. Wack, G. Sekerak, and R. Rittle. Social facilitation of dominant responses by the presence of an audience and the mere presence of others. Journal of Personality and Social Psychology, 9(3):245, 1968.

[15] DataVideo. nNovia data video hard drive recorder. http://www.datavideo.us/datavideo-product-families/datavideo-video-recorders/datavideo-dn-200-hard-drive-recorder/.

[16] J. DeCew. In pursuit of privacy: Law, ethics, and the rise of technology. Cornell University Press, 1997.

[17] A. Dennis and J. Valacich. Computer brainstorms: More heads are better than one. Journal of applied psychology, 78(4):531, 1993.

[18] M. Dickey-Kurdziolek, M. Schaefer, D. Tatar, and I. Renga. Lessons from thoughtswap-ing: increasing participants' coordinative agency in facilitated discussions. In Proceedings of the 2010 ACM conference on Computer supported cooperative work, pages 81–90. ACM, 2010.

[19] M. Diehl and W. Stroebe. Productivity loss in brainstorming groups: Toward the solution of a riddle. Journal of Personality and Social Psychology, 53(3):497, 1987.

[20] M. Diehl and W. Stroebe. Productivity loss in idea-generating groups: Tracking down the blocking effect. Journal of personality and social psychology, 61(3):392, 1991.

[21] L. Festinger, S. Schachter, and K. Back. Social pressures in informal groups, 1950.

[22] R. Gallupe, L. Bastianutti, and W. Cooper. Unblocking brainstorms. Journal of Applied Psychology, 76(1):137, 1991.

[23] J. Hackman and C. Morris. Group tasks, group interaction process, and group performance effectiveness: A review and proposed integration. Defense Technical Information Center, 1975.

[24] E. Hornecker. Creative idea exploration within the structure of a guiding framework: the card brainstorming game. In Proceedings of the fourth international conference on Tangible, embedded, and embodied interaction, TEI '10, pages 101–108, New York, NY, USA, 2010. ACM.

[25] P. Hunt and J. Hillery. Social facilitation in a coaction setting: An examination of the effects over learning trials. Journal of Experimental Social Psychology, 9(6):563–571, 1973.

[26] International Business Machines. IBM TSpaces. http://www.almaden.ibm.com/cs/tspaces/, 11 2012.


[27] J. Jacobs. The death and life of great American cities, 1961.

[28] W. Jones and J. Teevan. Personal Information Management. Univ of Washington Pr, 10 2007.

[29] H. Lamm and G. Trommsdorff. Group versus individual performance on tasks requiring ideational proficiency (brainstorming): A review. European journal of social psychology, 3(4):361–388, 1973.

[30] J. S. Lee, D. Tatar, and S. Harrison. Micro-coordination: because we did not already learn everything we need to know about working with others in kindergarten. In Proceedings of the ACM 2012 conference on Computer Supported Cooperative Work, CSCW '12, pages 1135–1144, New York, NY, USA, 2012. ACM.

[31] R. Likert. A technique for the measurement of attitudes. Archives of psychology, 1932.

[32] Microsoft. Microsoft excel 2011 for mac. http://www.microsoft.com/mac/excel.

[33] B. Nijstad, W. Stroebe, and H. Lodewijkx. Production blocking and idea generation: Does blocking interfere with cognitive processes? Journal of Experimental Social Psychology, 39(6):531–548, 2003.

[34] Oracle. Java swing. http://docs.oracle.com/javase/1.4.2/docs/api/javax/swing/package-summary.html, 11 2012.

[35] A. Osborn. Applied imagination. 1953.

[36] P. Paulus. Groups, teams, and creativity: The creative potential of idea-generating groups. Applied Psychology, 49(2):237–262, 2000.

[37] P. Paulus, M. Dzindolet, G. Poletes, and L. Camacho. Perception of performance in group brainstorming: The illusion of group productivity. Personality and Social Psychology Bulletin, 19(1):78–89, 1993.

[38] P. B. Paulus, T. S. Larey, and A. H. Ortega. Performance and perceptions of brainstormers in an organizational setting. Basic and Applied Social Psychology, 17(1-2):249–265, 1995.

[39] D. Pedersen. Dimensions of privacy. Perceptual and Motor Skills, 48(3c):1291–1297, 1979.

[40] D. Pedersen. Psychological functions of privacy. Journal of Environmental Psychology, 17(2):147–156, 1997.

[41] D. Pedersen. Model for types of privacy by privacy functions. Journal of environmental psychology, 19(4):397–405, 1999.


[42] J. Pissarra, C. J. Costa, and M. Aparicio. Brainstorming reconsidered in computer-mediated communication and group support system context. In Proceedings of the Workshop on Information Systems and Design of Communication, ISDOC '12, pages 45–50, New York, NY, USA, 2012. ACM.

[43] Qualtrics. Qualtrics surveys. https://virginiatech.qualtrics.com/.

[44] R Project. The r project for statistical computing. http://www.r-project.org/.

[45] B. Reeves, A. Lang, E. Kim, and D. Tatar. The effects of screen size and message content on attention and arousal. Media Psychology, 1(1):49–67, 1999.

[46] SAS. Jmp software for data analysis. http://www.jmp.com/.

[47] B. Shapiro and C. Baker. Information technology and the social construction of information privacy. Journal of Accounting and Public Policy, 20(4):295–322, 2002.

[48] P. Skeidsvoll. Latex table editor. http://truben.no/latex/table/.

[49] StackExchange. Latex stack exchange. http://tex.stackexchange.com/users/21586/michael?tab=votes.

[50] Sun Systems. Java 1.4.2. http://www.oracle.com/technetwork/java/javase/index-jsp-138567.html, 11 2012.

[51] D. Tatar and S. Harrison. Hcc-small: Human micro-coordination in a world of pervasive computing: Understanding emotional, personal, interpersonal and behavioral interconnections. http://nsf.gov/awardsearch/showAward?AWD_ID=1018607.

[52] D. W. Taylor, P. C. Berry, and C. H. Block. Does group participation when using brainstorming facilitate or inhibit creative thinking? Administrative Science Quarterly, 3(1):23–47, 1958.

[53] Third Lab at Virginia Tech. Third lab. http://third.cs.vt.edu/.

[54] N. Triplett. The dynamogenic factors in pacemaking and competition. The American journal of psychology, 9(4):507–533, 1898.

[55] TrueCrypt. Truecrypt disk encryption. http://www.truecrypt.org/.

[56] U.N. General Assembly. Universal declaration of human rights. Resolution adopted by the General Assembly, 10(12), 1948.

[57] Virginia Tech. Virginia tech corporate research center. http://www.vtcrc.com/.

[58] VT-CHCI. User game source code. https://github.com/VT-CHCI/User-Game.


[59] A. Warr and E. O'Neill. Understanding design as a social creative process. In Proceedings of the 5th conference on Creativity & Cognition, pages 118–127. ACM, 2005.

[60] A. Westin. Privacy and freedom. Washington and Lee Law Review, 25(1):166, 1968.

[61] C. E. Wilson. Brainstorming pitfalls and best practices. interactions, 13(5):50–63, Sept. 2006.

[62] R. Zajonc et al. Social facilitation. Research Center for Group Dynamics, Institute for Social Research, University of Michigan, 1965.


Appendix A

Questionnaire

At the end of each session, the participants closed the game software on the laptop they were using and found a questionnaire open in a browser window. The participants completed this online questionnaire and then were free to leave. A copy of the questionnaire is included here for reference.


(Questionnaire pages reproduced as full-page images in the original document.)

Appendix B

Guidelines for Researchers Instructing Participants

As credited in my Acknowledgements, I had the great fortune of having undergraduate researchers work with me for almost every phase of this project. Data collection, that is, running the study with the recruited participants, was one place where I was best able to utilize their assistance. Because this meant I would not be at all the sessions myself, and to better document my study, I prepared the following guidelines for how to run the study and instruct the participants. The following is one section of this document, which was intended primarily to help the researcher introduce the tool, and the task to be completed, to the participants.

B.1 Description of Study to Participants

Researcher: Please take a station. Now, let me explain the controls. You use the mouse to click and drag the pictures. So on your turn, you will choose some pictures and drag them one at a time from the staging area, which is the lower half of the screen, to the playing area in the top half. You can also drag the images around in the staging area regardless of whose turn it is; this will not count as your turn. When one person finishes adding their tiles and telling their story, it's the next person's turn.

For the less private condition:
Researcher: Whenever you move an image, this change will be visible to all of you as it happens on everyone's machine.

For the more private condition:
Researcher: When you move or place an image in the playing area (the top half of your screen), everyone will see this as it will also happen on their machine. However, when you move the images in the lower half of your screen, they will only move on your machine. Your actions will not be visible to your collaborators.



Both Conditions:
We are trying to understand how people design collaboratively on computers. Sometimes designers use techniques, such as games, to help get their process started and to help them develop a shared understanding of the thing they are trying to design. We have a design game that we are going to ask you to play. Your job is to get started with the design of a 〈their condition's design target〉. To do this, designers often envision people using the setting they are designing. They consider the people's activities, feelings, care-abouts, company, how long they will be there, and any other things about the use of the space that they can. They come up with a set of associations that they want people to have with the place. For example, Starbucks wants you to see their store as an alternative place to live, in addition to home or workplace. On your turn, choose two to four images from the staging area that you feel tell a story about the possible users of 〈their condition's design target〉. If you are the first person, then you will use five or more images. However, after the first person's first turn, everyone will only add two to four cards. After the first player's first turn, you must add on to the existing cards, sort of like Scrabble or a crossword puzzle. So you will have to use at least one tile from a previous turn, even if you don't use the story that player told. Do you have any questions so far?

Researcher: OK, please begin.