“Knowledge Representation Methods” first discusses methods grounded in library science. The commonly used enumerative systems like the Dewey and Library of Congress classification schemes are described in detail, and passing reference is made to faceted approaches. Techniques based in computer and information science, like hypertext, vector models, and computationally based techniques for disambiguating word meanings, are also described.

“Distribution” compares distribution of information via physical media (e.g., CD-ROM) to distribution via current network technologies (i.e., the World Wide Web). The most valuable part of this chapter discusses the weaknesses of Internet and Web technology for providing trustworthy, secure, and private access to content, and trustworthy payment mechanisms. “Usability and Retrieval Evaluation” combines discussions of the design of usable systems and methods for evaluating system effectiveness. These topics are relevant to the construction of practical digital libraries, but the focus of the chapter is heavily weighted towards system design issues rather than evaluation techniques.

“Collections and Preservation” reviews some of the complex issues concerning what it means to perform collection development for digital materials, as well as the problematic issues of providing long-term access to digitized material. Specific contemporary issues like digitization as a technique for preserving information printed on acid-based paper are discussed. The challenges of applying digital techniques to archives and special materials, like maps, are also covered here. Lesk provides a detailed discussion of the preservation problems posed by the rapid rate at which computerized systems become obsolete. “Economics” provides a wide-ranging overview of topics including library funding models, the economics of scholarly publishing, and the opportunities and barriers to evolving existing systems and markets toward Internet-based systems. Typical of these issues is the conflict between information users, who may realize that there is no longer a requirement for more than two copies of any item (one copy existing primarily as a “backup” copy), and the interests of for-profit content providers, who are concerned about maintenance of revenue streams in the networked, digitized environment.

“Intellectual Property Rights” discusses the unsettled and hotly debated topics of copyright, patent law, fair use, and how existing laws and treaties relate to networked intellectual property. Lesk also relates the fears of governments concerning the use of the Internet by drug dealers, terrorists, and spies. The chapter concludes with brief descriptions of a number of technologies, like digital watermarks, cryptolopes, and flickering, which attempt to address these issues. “International Activities” provides very brief summaries of current and recent digital library projects in the U.S., Europe, and Asia. The final chapter, “Future: Ubiquity, Creativity, and Public Policy,” summarizes the book and considers likely future developments in digital libraries. Here Lesk offers his and other experts’ views on the future of digital libraries, and the interrelationships between public policy, law (and lawsuits), market forces, and the development of technology.

There are few criticisms to make of this book. One is that the inclusion of a set of full-color plates may not have been necessary. While interesting, most are not central to developing an understanding of the issues raised in the text, and they drive the cost of the book upward. Eleven pages of the chapter “Distribution” are needlessly devoted to describing basic network technologies and Internet protocols. The short subsection on evaluation in Chapter 7 unfortunately discusses only precision and recall studies, which are extremely poor indicators of overall system quality. Lesk does not equivocate in his writing, and in some cases his statements, presented in a factual tone, are arguable. For example, Lesk writes “High-energy physics now depends entirely on a bulletin board at Los Alamos National Laboratory” (p. 214), implying that the role of the traditional scientific journal has been overturned by this low-tech repository of preprints. It is probable that Lesk did not mean to imply that physicists no longer publish in journals, but the quote in the book yields that impression. Finally, the index is not as well constructed as it might be, making references to both subjects and names difficult to find. For example, there was no easy way to locate the above quote via the index because there are no index entries for Los Alamos, preprints, physics, or Paul Ginsparg, who administers the bulletin board. The index entry “bulletin board” points to an unrelated reference in another part of the book.
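For readers unfamiliar with those measures, precision is the fraction of retrieved items that are relevant, and recall the fraction of relevant items that are retrieved. With R the set of relevant documents and S the set a system returns (the notation is this reviewer's shorthand, not Lesk's):

\[
\text{precision} = \frac{|R \cap S|}{|S|},
\qquad
\text{recall} = \frac{|R \cap S|}{|R|}
\]

Both ratios can be high while a system remains slow, confusing, or poorly matched to its collection, which is why they are weak proxies for overall system quality.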

Librarians, computer scientists, information scientists, and students who need to get a sense of the field will find this book valuable. It is an excellent textbook for an introductory course dealing with digital libraries and networked information delivery. The book will also give researchers and practitioners who need to design and build digitized collections of content a sense of some of the important trade-offs among technological choices. It also describes possible approaches to resolving the fundamental conflict between content providers, who worry about loss of control of intellectual property, and users and information system providers, who want to enable access to content. This book will be less useful to those already involved in the design and operation of digital libraries because of its general treatment of these topics. Practical Digital Libraries includes an 11-page list of references, which includes many important papers, books, and conference proceedings.

Some books become instant classics as soon as they are published. Practical Digital Libraries: Books, Bytes, and Bucks is the first classic treatment of digital libraries. Despite some flaws, future comprehensive treatments of the subject will be compared to this, the standard text. Michael Lesk has succeeded in bringing both order and a reasoned interpretation to what is a dynamic interdiscipline. The result is a volume equally valuable for academics, practitioners, and students.

Robert J. Sandusky
CANIS—Community Systems Laboratory
Graduate School of Library and Information Science
University of Illinois at Urbana-Champaign
Champaign, IL 61820-6211
E-mail: [email protected]

Coordinating Technology: Studies in the International Standardization of Telecommunications. Susanne K. Schmidt and Raymund Werle. Cambridge, MA: The MIT Press; 1998: 365 pp. Price: $85.00. (ISBN 0-262-19393-0.)

Glancing at this title, one may think that this book is primarily a set of case studies about telecommunications. In part, it is. But this small piece of a much bigger whole is only a means to explore the grander issue of standardization and its facilitation of coordination and compatibility. Schmidt and Werle aim to draw attention to the social and political shaping of standards; unlike previous research, they do not focus exclusively on economic perspectives. The text is so involved and complex that a review can do little more than touch on its more salient points.

The first part of this book (Chapters 1–4), called “Theorizing Standards,” introduces the reader to standards and their key role of coordination. After laying down a firm theoretical base, the authors proceed to analyze the development of standards and give an overview of the many telecommunication standardization organizations. The final issues addressed in this section are standardization “games” and a general model of the standardization process.

The second part of the book (Chapters 5–9), called “Making Standards,” contains three case studies of international committee standardization. These studies are of interactive videotex, facsimile transmission, and electronic message handling (X.400). Each study follows “the trail from realization by certain actors that a standard is needed through complex negotiations that involve many economic, political, and social interests (all couched in technical terms) to final agreement on a standard” (p. 1).

The concluding section (Chapters 10–13), aptly called “Interpretation and Generalization,” takes the preceding theoretical guides and empirical case studies and elaborates upon them, presenting “a more comprehensive picture of technical coordination through institutionalized international standardization” (p. 265).

“Standards are abstract specifications of the necessary features of a component that make it compatible with the rest of the system—they ensure its ‘fit’” (p. 3). In other words, standards facilitate, or are “mediums” of, compatibility. Since the world is full of many different telecommunication systems, standardization seeks to achieve some level of coordination or interoperation between divergent platforms. Although standardization does not necessarily choose the “best” option(s) or alternative(s), it seeks to arrive at some common ground that promotes compliance as a “collective acknowledgment of technological interdependence” (p. 120).

The authors’ intent is to understand the process of standardization, not to “develop a social theory of technical standardization, since it makes no sense to decorate every social phenomenon with its own substantive theory” (p. 16). The frame of reference used in this study is called actor-centered institutionalism. This approach endeavors to relate institutions (meso-level elements) to actors (micro-level elements), thus partnering those social agents which work jointly to influence and mold technology. This linking allows one to utilize, clarify, and synthesize various social, political, and economic theories into an analytical framework. Actor-centered institutionalism avoids some of the criticisms leveled against social constructivism, i.e., lack of explanatory power, social determinism, and micro-centeredness. “What is gained by this effort is a better fit between theoretical perspectives and observed processes of technical choice and technological development” (p. 17).

One may wonder why the authors have concentrated on telecommunication technology and standards produced by the Comité Consultatif International Télégraphique et Téléphonique (CCITT). Several reasons exist for this focus. The CCITT is the “most prominent international producer of telecommunication standards” (p. 2). By virtue of its fairly neutral environment, secrecy tends to be less of a hindrance to social research than in the corporate setting. Telecommunications also provides a unique perspective on the generation of technology. In particular, the authors note that “[o]ur decision to concentrate on three standards processed mainly within the CCITT precludes a systematic comparative assessment of the impact of institutional variables on standards. Thus, by keeping the organizational context constant, we deliberately eliminate this source of variation” (p. 110).

When looking at compatibility, one can see two “faces”—compatible complements and compatible substitutes. The former are vertically compatible: Y is used with Z. The latter are horizontally compatible, or functionally equivalent (in other words, competitors): either X or Y can be used with Z. A fax machine and the telephone network, for instance, are compatible complements, while two rival fax machines are compatible substitutes.

Compatibility can be further investigated by using game theory. “Game theory offers a terminology and basic concepts for modeling and analyzing actors as ‘players’ in games, where the individual ‘payoff’ depends not only on individual choice but also on what the other players do” (p. 100). The pure-coordination game and the battle of the sexes are “positive-sum” games in which an agreement on a solution is reached because a higher overall payoff can be achieved. The zero-sum game and the prisoner’s dilemma illustrate coordination failure, in which no standard is agreed upon.
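As a stylized illustration of the battle of the sexes (the numbers are this reviewer's, not the authors'): suppose two firms must choose between standards A and B, each preferring its own technology but both preferring any agreement to incompatibility. With the row player's payoff listed first:

\[
\begin{array}{c|cc}
 & A & B \\
\hline
A & (2,\,1) & (0,\,0) \\
B & (0,\,0) & (1,\,2)
\end{array}
\]

Both (A, A) and (B, B) leave the players better off than miscoordination, so negotiation tends to settle on one of them; in a prisoner's dilemma, by contrast, each player's individually rational choice undermines any agreed standard.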

A heuristic model comprising three sections (structural aspects, process aspects, and output) is offered as a tool for the study of the standardization process. Structural aspects include the institutional framework (rules and organizational relations), actors (motives, interests, perceptions, and resources), and the technological foundation (feasibility, knowledge base, designs, and problems). Process aspects include the decision-making process, which incorporates problem complexity, actor constellation, strategies, and dynamics. The output is the standard, either approved or not; a schematic sketch of the model follows the quotation below. The authors note:

For the sake of conceptual clarity (which is missing in many essays on standardization), we emphasize at this point that the standards whose development we consider in our case studies belong to the class of coordinative standards. Thus, our model of standardization process applies only to coordinative standards. (p. 119)
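A schematic rendering of the model's three sections as a simple data structure may help fix its shape in mind. The sketch is purely illustrative: the field names paraphrase the model's components, and nothing like it appears in the book.

    from dataclasses import dataclass
    from typing import List, Optional

    @dataclass
    class StructuralAspects:
        institutional_framework: List[str]   # rules and organizational relations
        actors: List[str]                    # motives, interests, perceptions, resources
        technological_foundation: List[str]  # feasibility, knowledge base, designs, problems

    @dataclass
    class ProcessAspects:
        problem_complexity: str   # how hard the coordination problem is
        actor_constellation: str  # who participates and how their interests align
        strategies: List[str]     # negotiating strategies the actors pursue
        dynamics: str             # how the process unfolds over time

    @dataclass
    class StandardizationProcess:
        structure: StructuralAspects
        process: ProcessAspects
        approved: Optional[bool] = None  # output: the standard is approved or not

The point of the rendering is simply that the model separates relatively stable inputs (structure) from the negotiation itself (process) and its binary outcome (output).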

The three case studies are presented according to the following arrangement: an overview of the standard’s history, central technical and economic problems, actors and interests, minor and major conflicts, and conflict resolution or termination. This structure allows the reader to easily compare and contrast various elements of the standardization process.

Interactive videotex is basically an information-retrieval system that utilizes a telephone line to access the information sought, be it graphics or text, and a monitor or television to display retrieved items. This case study is a prime example of what happens when the actors involved tie economic expectations to the standards proposed: competition overshadows compromise. No single standard was agreed upon, only a stalemate solution of several different options.

Facsimile was first patented in 1843, but came into its own after 1980. It has become the most widely used means of transmitting text via telecommunication systems. The nearly ubiquitous diffusion it enjoys is based upon many factors, significant among them standardization. Not without its share of conflicts, “facsimile is an impressive example of the limits of market-based standardization, the significant potential of committees, and the benefits of technical coordination” (p. 185).

X.400 is an electronic mail standard that saw its beginning in 1978. Unlike the previous two technologies, a pre-standardization committee, called the International Federation for Information Processing (IFIP), began work on this standard, since the idea for a standardization project arose in the middle of a CCITT study period. Even though X.400 has not made a great impact on the Internet, this case study is an excellent example of a successful attempt at developing an ex ante, comprehensive standard.

The concluding section of this book is a comprehensive examination of international standardization and telecommunications. In general, telecommunications involves a worldwide system with a multitude of actors using diverse connections. Since there is no central entity that governs how these various nodes must connect, compatibility must exist in order to provide interoperation between terminals, and the only way to foster compatibility is through coordinated action. This action results in standards worked out by interested parties in a neutral arena—the standardization organization. The organizations “are not simply clearing agencies that select a technical solution from a given pool. Rather, they create their own solutions, which later, as elements of the organizations’ history and tradition, shape the work on related standards” (p. 306).

Possessing an extensive list of references and an excellent index, this book is an important study of international standardization. Schmidt and Werle have drawn upon many complementary theories to illustrate their ideas, and have done so admirably. Frankly, there has been so much synthesis of thought that the intricate weave can at times overwhelm the reader. I feel that the best strategy is to keep pushing forward, absorbing each section’s essence; rereading a portion later will prove a worthwhile effort. This book is a must for academic libraries, large public libraries with a deep technology collection, relevant special libraries, and anyone who is interested in standardization and technology.

Jeff White
Medical Group Management Association
1012 Louisiana Place
Longmont, CO 80501-6625
E-mail: [email protected]

The Soft Edge: A Natural History and Future of the Information Revolution. Paul Levinson. London and New York: Routledge; 1997: 257 pp. Price: $25.00. (ISBN 0-415-15785-4.)

This work is about “the difference that communications media make in our lives” (p. xi). It has a broad historical scope, ranging from the development of the alphabet to the future of artificial intelligence, and takes a non-technical approach. Its scope and perspective promise great interest.

There are a number of isolated aperçus. We learn that in ancient Greece, democracy was defined by the extent to which an audience could hear a speaker’s voice, although the implied congruence between a discourse and a speech community is not noticed. Nathaniel Hawthorne’s 1851 comment on the telegraph is quoted: “Is it a fact—or have I dreamed it—that, by means of electricity, the world of matter has become a great nerve vibrating thousands of miles in a breathless point of time?” (p. 127). The possibility implied by the date and geopolitical context of the quotation, that the mid- to late-19th century represents the critical period for the adoption of technologies subsequently subsumed under the generic term information technology, and that they were produced by the demands of American continental expansion for communication across space, is not explored. The quotation is indicative of the perspective assumed, which is that of the United States, with little deep sense of the possibility of other cultural perspectives on information developments.

The historical range of reference of the work is not matched by a deeper historical understanding. For instance, it is asserted that no group without writing has achieved a civilization: Homeric, and other sophisticated, primarily oral, cultures provide counter-examples. The treatment of the development of copyright is particularly unsatisfactory. For Levinson, the “biological antiquity of property . . . is no doubt what led to the development of provisions for intellectual property when the printing press for the first time put such property in the form of books on everyone’s shelf” (pp. 202–203). Yet non-American cultures, most notably feudal and Marxist ones, have valued copying and dissemination above personal property rights, if these were even acknowledged—for instance, Chinese printing, whose early development is remarked upon by Levinson, did not give rise to Western concepts of copyright. The transformation of copyright from a printers’ to an author’s right in the late-18th and early-19th centuries is seen as a correction, not as a transformation which could be connected to the concurrent emergence of the Romantic conception of the author as a creative individual. Significant, potentially highly relevant sources are neglected in discussions of other topics: for instance, considerations of orality and of the history of writing show no awareness of the classic work of Diringer or the more recent studies by Ong, Gaur, and Harris.

There are also theoretical limitations. The distinction between natural and man-made artifacts is obscured in the proposal for the further development of “evolutionary epistemology,” which would study the analogies between “the evolution of biological organisms and the evolution of human knowledge” (p. xvi). While there may be analogies in patterns of development, the mechanisms are radically different (except in the curious and significant cases of selective breeding and genetic engineering), one an effect of nature, the other the product of human intellectual labor. There is a correspondingly limited recognition of technology as a human construction. The crucial distinction between invention and innovation (or social diffusion) is hinted at but not consistently developed. A causal role is attributed to information technology developments: The alphabet is regarded as an influence towards monotheism, not simply as a carrier for its dissemination, and capitalism is implied to be a product of the printing press. In summary, the approach implied by Walter Benjamin’s remark that, “within the phenomenon [of the possibilities of reproduction of art objects] which we are here examining from the perspective of world history, print is merely a special, though particularly important, case” (Benjamin, 1969), is partly realized through the scope of historical reference. The associated, and persuasive, position, which gives full primacy to the economic and cultural over the technological, that the adoption of printing and the wider dissemination of texts was a product of the stress on personal conscience in Protestantism, is not explored.

The method of argument tends to be that of assertion—the phrase “no doubt” recurs to link questionable causalities. The style itself has the fluidity characteristic of spoken delivery, as well as some more clearly marked oral linguistic features, and the rapid transition between topics also found in compilations of short pieces. The subtlety of sources can be obscured: in the discussion of Socrates’ objections to writing as inert and unresponsive, it is not noted that similar objections are made to oral communication delivered as a speech rather than dialectically engaged; Frankenstein’s status as a victim of human conduct in Mary Shelley’s narrative, as well as when represented by Boris Karloff, is not noticed; and, most crucially, the reservations on free speech as an undivided good made in classical liberal discussions (which, for instance, attend to the time and place of utterances, and do not condone incitement to riot) are neglected.

In conclusion, the scope and ease of reading of the work make it valuable, despite its deeper deficiencies. Its broader value could be said to lie in its failure to fulfill its intentions: It simultaneously exposes an interesting area and the need to occupy that area coherently and intelligently.

Julian Warner
School of Management
The Queen’s University of Belfast
Belfast BT7 1NN
Northern Ireland
E-mail: [email protected]

Reference

Benjamin, W. (1969). The work of art in the age of mechanical reproduction. In H. Arendt (Ed.) & H. Zohn (Trans.), Illuminations (pp. 217–251). New York: Schocken Books.

The Death of Distance: How the Communications Revolution Will Change Our Lives. Frances Cairncross. Boston, MA: Harvard Business School Press; 1997: 303 pp. Price: $19.95. (ISBN 0-87584-806-0.)

There is no shortage of books presuming to forecast the consequences of current developments in emergent communications
