
Mutual Redundancies in Interhuman Communication Systems: Steps Toward a Calculus of Processing Meaning

Loet Leydesdorff
Amsterdam School of Communication Research (ASCoR), University of Amsterdam, Kloveniersburgwal 48, 1012 CX Amsterdam, The Netherlands. E-mail: [email protected]

Inga A. Ivanova
Department of International Education & Department of Economics and Production Management, Far Eastern Federal University, Office 514, 56 Aleutskaya Street, Vladivostok 690950, Russia. E-mail: [email protected]

The study of interhuman communication requires a more complex framework than Claude E. Shannon’s (1948) mathematical theory of communication because “information” is defined in the latter case as meaningless uncertainty. Assuming that meaning cannot be communicated, we extend Shannon’s theory by defining mutual redundancy as a positional counterpart of the relational communication of information. Mutual redundancy indicates the surplus of meanings that can be provided to the exchanges in reflexive communications. The information is redundant because it is based on “pure sets” (i.e., without subtraction of mutual information in the overlaps). We show that in the three-dimensional case (e.g., of a triple helix of university–industry–government relations), mutual redundancy is equal to mutual information (Rxyz = Txyz); but when the dimensionality is even, the sign is different. We generalize to the measurement in N dimensions and proceed to the interpretation. Using Niklas Luhmann’s (1984–1995) social systems theory and/or Anthony Giddens’s (1979, 1984) structuration theory, mutual redundancy can be provided with an interpretation in the sociological case: Different meaning-processing structures code and decode with other algorithms. A surplus of (“absent”) options can then be generated that add to the redundancy. Luhmann’s “functional (sub)systems” of expectations or Giddens’s “rule-resource sets” are positioned mutually, but coupled operationally in events or “instantiated” in actions. Shannon-type information is generated by the mediation, but the “structures” are (re-)positioned toward one another as sets of (potentially counterfactual) expectations. The structural differences among the coding and decoding algorithms provide a source of additional options in reflexive and anticipatory communications.

Introduction

This study originated from the puzzle caused by the possible change in sign when mutual information is measured in more than two dimensions. Mutual information in more than two dimensions has been considered as a signed information measure (Yeung, 2008), but it has remained poorly understood given the Shannon framework that allows only for forward transmission of information in messages and the positive generation of probabilistic entropy (Krippendorff, 2009a, 2009b). In the discussions about extending the number of helices beyond a triple helix of university–industry–government relations to quadruple, quintuple, or n-tuple sets of helices, the problem of possible sign changes in this operationalization had remained unresolved (Kwon, Park, So, & Leydesdorff, 2012; Leydesdorff & Sun, 2009; cf. Carayannis & Campbell, 2009, 2010; Leydesdorff, 2012a).

When the number of possible options by cross-tabling the different codes of communication in the various helices increases, the uncertainty relative to the maximum information content can decrease. This possible reduction of uncertainty (“synergy”; Leydesdorff & Strand, 2013) in the case of an increasing number of options has sociological relevance since both uncertainty and expectations are involved in interhuman communications. How can the rich domain of interhuman—reflexive—communications be modeled so that the methodological apparatus of Shannon’s (1948) mathematical theory of communication can fruitfully be applied (e.g., Krippendorff, 1986; Theil, 1972)?

Theoretical Background

In the final chapter of his study entitled “Toward a Sociological Theory of Information,” Harold Garfinkel

Received January 29, 2013; revised March 12, 2013; accepted March 13, 2013

© 2013 ASIS&T • Published online in Wiley Online Library (wileyonlinelibrary.com). DOI: 10.1002/asi.22973

JOURNAL OF THE AMERICAN SOCIETY FOR INFORMATION SCIENCE AND TECHNOLOGY, ••(••):••–••, 2013


(1952/2008)—the founding father of ethnomethodology—listed no fewer than 92 “Problems and Theorems” that were at the time awaiting elaboration at the interface between sociology and information theory. Like others (e.g., his teacher Talcott Parsons), Garfinkel was acutely aware that on the basis of the work of the American pragmatists (Herbert Mead, William James) and European phenomenologists (Edmund Husserl, Alfred Schütz), the study of “large scale self-regulating systems” (p. 208) was placed on the agenda of sociology by then-recent advances in information theory by von Neumann and Morgenstern (1944), Shannon and Weaver (1949), Deutsch (1951), Miller (1951), Ruesch and Bateson (1951), and others.

Most authors in the sociological tradition work within a paradigm where information cannot be considered content-free and meaningless, as it is defined in Shannon’s (1948) mathematical theory of communication. Shannon defined information as “uncertainty”—operationalized as probabilistic entropy—and not as “informative” in the sense of reducing uncertainty.1 His coauthor and commentator, Warren Weaver, called this definition “bizarre,” but “potentially so penetratingly clearing the air that one is now, perhaps for the first time, ready for a real theory of meaning” (p. 27). However, a theoretical impasse was generated that would last for decades (cf. MacKay, 1969): On one side, Shannon’s summations of probabilities and their logarithms cannot add up to “a difference that makes a difference” for a system of reference (e.g., an observer; Bateson, 1972, p. 315), and on the other side, social scientists are not inclined to abandon “action” or “an observer” as the system of reference.

Hayles (1990) compared these diametrically opposed definitions of information—as uncertainty or as reduction of uncertainty—to defining a glass as half full or half empty. She mentioned Weaver’s (1949, p. 26) proposal to introduce the communication of “meaning” as “another box in the diagram which, inserted between the information source and the transmitter, would be labeled ‘semantic noise,’ the box previously labeled as simply ‘noise’ now being labeled ‘engineering noise’ ” (1990, p. 193; see Figure 1).

The communication of (Shannon-type) information necessarily adds uncertainty because Shannon (1948) defined information as probabilistic entropy: $H = -\sum_i p_i \log_2 p_i$. Probabilistic entropy is coupled to Gibbs’s formula for thermodynamic entropy, $S = k_B H$. In Gibbs’s equation, $k_B$ is the Boltzmann constant that provides the dimensionality Joule/Kelvin to S while H is dimensionless and can be measured in bits of information (or any other base for the logarithm) given a probability distribution (containing uncertainty). The second law of thermodynamics states that entropy increases with each operation, and Shannon-type information is therefore always positive (because $k_B$ is a constant).
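For concreteness, probabilistic entropy can be computed directly from a probability distribution. The following minimal sketch (the function name and the example distributions are ours, added for illustration) measures H in bits:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum_i p_i * log2(p_i), in bits.

    Zero-probability outcomes contribute nothing (p * log p -> 0 as p -> 0).
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly one bit of uncertainty:
print(shannon_entropy([0.5, 0.5]))  # 1.0
# A biased coin carries less:
print(shannon_entropy([0.9, 0.1]))  # ~0.469
```

Any other base for the logarithm would only rescale the result, as noted above.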

Unlike information that is transmitted from a sender to a receiver in accordance with the arrow of time, meaning is provided from the perspective of hindsight (i.e., against the arrow of time). In other words, meaning operates incursively (Dubois, 1998) (i.e., reflexively with reference to the present state) whereas information is generated recursively (i.e., with reference to a previous state, such as a sender for a receiver). A local reversal of the time axis within a system is compatible with the second law if the entropy increases at the aggregated level of the total system (e.g., Wicken, 1989). Local pockets of negative entropy (e.g., niches) are thus possible within the entropy flow.

In summary, Weaver’s (1949) “semantic noise” may add more to the redundancy than to the uncertainty when the communication is reflexive. In Weaver’s formulation: “This [redundancy] is the fraction of the structure of the message which is determined not by the free choice of the sender, but rather by the accepted statistical rules governing the use of the symbols in question” (p. 13). From this perspective, semantics and its rules can function as sources of redundancy that add to the maximum entropy by enlarging the number of options. The reflexive animal symbolicum (Cassirer, 1923/2010) operates by entertaining a symbolic order (Deacon, 1997) that multiplies the envisaged possibilities to interpret events. When the redundancy increases faster than the uncertainty, the latter can be reduced in an evolving system while nevertheless respecting the second law (Brooks & Wiley, 1986, p. 42).

Self-organizing systems use energy to reduce uncertainty internally. Based on Maturana’s (1978; cf. Maturana & Varela, 1980) biological theory of self-organization or autopoiesis, Luhmann (1986a, 1984/1995) proposed considering the communication of meaning as the specific operation of the social system. From this perspective, the operation of the interhuman communication system is different from, but “structurally coupled” with, the processing of meaning in individual “consciousness” at the actor level. “Structurally coupled” means that the one system cannot operate without the other as its directly relevant environment: Luhmann’s social systems and individual consciousness embed each other reflexively (using “interpenetration”; Luhmann, 1991, 2002), but the control mechanisms in the two systems are different. Whereas the individual as a unity can be expected to strive for integration and the organization of different

1Varela (1979, p. 266) noted the original etymology of “in-formare” (Horn & Wilburn, 2005).

FIG. 1. Schematic diagram of a general communication system. Source: C.E. Shannon (1948, p. 380); with Weaver’s box of “semantic noise” added.


meanings in one’s mind, society can tolerate functional differentiation and asynchronicity among subsystems of meaning processing with other degrees of freedom. Political discourse, for example, can be coupled to scholarly discourses to varying extents, but the two communication circuits do not have to be updated by each other at each step.

In accordance with the sociological and (most of the) cybernetic tradition, Luhmann (1984, pp. 102–105/1995, pp. 67–69) defined “information” as the reduction of complexity for a selecting (e.g., observing) system, with explicit reference to Bateson’s (1972) definition of information as “a difference that makes a difference” (p. 315). The communication of meaning is thus uncoupled from measurement (Brillouin, 1962) in terms of Shannon-type information. Shannon’s theory assumes that only information defined as uncertainty can be communicated whereas Luhmann (1990) assumes that only meaning can be communicated: The interhuman communication system is defined as communicating meaning. Consequently, Luhmann (1995) concluded that “the most important consequence of this analysis is that communication cannot be observed, only inferred” (p. 164; italics in the original). Thus, this approach tends to lose the operational perspective of measuring communications in terms of bits of information.

Research Problem

In this study, we try to solve the aforementioned impasse by distinguishing between the communication of information as a relational operation and the positioning of recursive loops that can be generated within communication networks when the information is provided with meaning (Leydesdorff, 2012b). In our opinion, the “communication of meaning” cannot be observed as historical because only Shannon-type information can be communicated in the (measurable) engineering box. However, the processing of meaning among reflexive agents can be considered as adding Weaver’s (1949) semantic box to the communication as a feedback arrow and thus ceteris paribus expected to reduce the prevailing uncertainty by increasing the maximum entropy.

In other words, this reduction of the uncertainty finds its origin in the increase of the number of options (N) when codes can operate upon one another, and accordingly the decrease of all probabilities expressed as relative frequencies (n/N). The different codes can be considered as generating different “alphabets” of which the character sets can be cross-tabled. The number of options is then multiplied. The processing of meaning can thus leave an imprint as a feedback by reducing the relative uncertainty or, in other words, increasing the redundancy in the entire configuration.

The Shannon-type mutual information can first be calculated in two dimensions. If one extends the formulas to three or more dimensions, however, the resulting value can no longer be considered as (Shannon-type) information (i.e., uncertainty). We shall argue that this value can be considered as the effect of an increase of the redundancy. Mutual redundancy can be considered as the positional counterpart of the generation of uncertainty in relational communications. In the three-dimensional case (e.g., a triple helix of university–industry–government relations), mutual redundancy is equal to mutual information, but when the dimensionality is even, the sign is different. We generalize to the measurement in N dimensions. This approach allows us to extend Shannon’s (1948) mathematical theory of informational communications toward the measurement of exchanges of meaning in terms of different positions. Different positions imply different angles for the reflection of the relational events.

A Signed Information Measure

Let us proceed to the formalization. The measurement of a negative imprint in terms of bits of information requires a signed information measure. As noted earlier, all Shannon measures are necessarily positive because of the coupling to the second law of thermodynamics. From this perspective, however, one seemingly inconsistent consequence of Shannon’s formulas has remained in the case of mutual information in three or more dimensions because the latter can be negative (e.g., McGill, 1954; Yeung, 2008, p. 59f). Garner and McGill (1956) compared mutual information in three dimensions with the possibility of negative variance in the case of three sources of variance.

The mutual information or transmission (T) between two variables x and y is defined (by Shannon) as the difference between the uncertainty in a probability distribution of a discrete random variable x (i.e., $H_x = -\sum_x p_x \log_2 p_x$) and the uncertainty in the conditional probability distribution of x given y ($H_{x|y} = -\sum_{xy} p_{xy} \log_2 p_{x|y}$). In formula format:

$T_{xy} = H_x - H_{x|y}$ (1)

or equivalently:

$T_{xy} = H_x + H_y - H_{xy}$ (2)

Rewriting this formula as $H_{xy} = H_x + H_y - T_{xy}$ makes possible a graphic and set-theoretical interpretation using Figure 2.

The total uncertainty in x and y (Hxy) is equal to the uncertainty in x plus the one in y; but one has to subtract the uncertainty in the overlap because this uncertainty would otherwise be counted twice. The uncertainty in the

FIG. 2. Overlapping uncertainties in x and y.


overlap also can be considered as mutual information or transmission (Txy).
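Equation 2 can be illustrated computationally. In the following sketch (the function names and the toy joint distributions are ours, added for illustration), the transmission Txy is obtained from a joint probability distribution by computing the two marginal uncertainties and the joint uncertainty:

```python
import math

def H(probs):
    """Shannon entropy in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def mutual_information(joint):
    """Equation 2: T_xy = H_x + H_y - H_xy, for a joint distribution
    given as a dict mapping (x, y) outcomes to probabilities."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0) + p
        py[y] = py.get(y, 0) + p
    return H(px.values()) + H(py.values()) - H(joint.values())

# Perfectly correlated binary variables share one full bit:
print(mutual_information({(0, 0): 0.5, (1, 1): 0.5}))  # 1.0
# Independent variables share nothing:
print(mutual_information({(0, 0): 0.25, (0, 1): 0.25,
                          (1, 0): 0.25, (1, 1): 0.25}))  # 0.0
```

In two dimensions this quantity is always non-negative, in line with the set-theoretical picture of Figure 2.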

It follows from this (e.g., Abramson, 1963, p. 129) that mutual information in the case of three variables x, y, and z, or, more generally, 1, 2, and 3 is:2

$T_{123} = H_1 + H_2 + H_3 - H_{12} - H_{13} - H_{23} + H_{123}$ (3)

Using a graphic representation analogous to Figure 2, one can reason as follows using the additive nature of the entropy function in Figure 3.

If one subtracts each of the three mutually overlapping areas in Figure 3 once, the total uncertainty (H123) has to be corrected first for each overlap because of double counting, but then the three-way intersection is three times subtracted and consequently no longer included. Therefore, this term (T123) has to be added after the subtractions. The total uncertainty is then:

$H_{123} = H_1 + H_2 + H_3 - T_{12} - T_{13} - T_{23} + T_{123}$ (4)

As in Equation 2:

$T_{12} = H_1 + H_2 - H_{12}$, (5)

and T13, T23, mutatis mutandis. The substitution of Equation 5 into Equation 4 leads to:

$H_{123} = H_1 + H_2 + H_3 - (H_1 + H_2 - H_{12}) - (H_1 + H_3 - H_{13}) - (H_2 + H_3 - H_{23}) + T_{123}$ (6)

$T_{123} = H_{123} - H_1 - H_2 - H_3 + (H_1 + H_2 - H_{12}) + (H_1 + H_3 - H_{13}) + (H_2 + H_3 - H_{23})$
$= H_1 + H_2 + H_3 - H_{12} - H_{13} - H_{23} + H_{123}$ (7)

The result of Equation 7 is identical to Equation 3, but the derivation is different since it is expressed in terms of the sets and subsets of Figure 3.
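The possibility of a negative T123 can be verified numerically. The sketch below (the helper names and the XOR example are ours, added for illustration) implements Equation 3 for a configuration in which any two variables are pairwise independent while the third is fully determined by the other two:

```python
import math
from itertools import product

def H(probs):
    """Shannon entropy in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def marginal(joint, dims):
    """Marginal distribution over the listed dimension indices."""
    out = {}
    for cell, p in joint.items():
        key = tuple(cell[d] for d in dims)
        out[key] = out.get(key, 0) + p
    return list(out.values())

def T123(joint):
    """Equation 3: T123 = H1 + H2 + H3 - H12 - H13 - H23 + H123."""
    return (H(marginal(joint, [0])) + H(marginal(joint, [1]))
            + H(marginal(joint, [2]))
            - H(marginal(joint, [0, 1])) - H(marginal(joint, [0, 2]))
            - H(marginal(joint, [1, 2]))
            + H(list(joint.values())))

# XOR configuration: every pair of variables is independent, but the third
# is determined by the other two; T123 is then negative.
xor = {(x, y, x ^ y): 0.25 for x, y in product((0, 1), repeat=2)}
print(T123(xor))  # -1.0
```

Each pairwise transmission in this example is zero, yet knowing two variables fixes the third; the negative value registers this configurational effect, which is invisible at the level of the bilateral relations.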

Analogously, it follows for higher dimensionality (e.g., four):

$T_{1234} = H_1 + H_2 + H_3 + H_4 - H_{12} - H_{13} - H_{14} - H_{23} - H_{24} - H_{34} + H_{123} + H_{124} + H_{134} + H_{234} - H_{1234}$ (8)

Note that the sign alternates each time for the lastly added (i.e., highest order) term. However, the different signs within each equation (as in Equations 7 and 8) provide the opportunity to distinguish between positive and negative contributions to the uncertainty by the mutual information in more than two dimensions.

Building on Ashby (1969), Krippendorff (2009b, p. 670) provided a general notation for this alternation with changing dimensionality—but with the opposite sign (which further complicates the issue; cf. Leydesdorff, 2010a, p. 68)—as follows:

$Q(\Gamma) = \sum_{X \subseteq \Gamma} (-1)^{|\Gamma| - |X| + 1} H(X)$ (9)

In this equation, Γ is the set of variables of which X is a subset, and H(X) is the uncertainty of the distribution; |Γ| is the cardinality of Γ, and |X| the cardinality of X.3
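Equation 9 can be sketched as a signed sum over all non-empty subsets of the variables. The following illustrative implementation (the function names and the examples are ours; we assume Krippendorff's sign convention as stated above) reproduces the alternation:

```python
import math
from itertools import combinations

def H(probs):
    """Shannon entropy in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def marginal(joint, dims):
    """Marginal distribution over the listed dimension indices."""
    out = {}
    for cell, p in joint.items():
        key = tuple(cell[d] for d in dims)
        out[key] = out.get(key, 0) + p
    return list(out.values())

def Q(joint, n):
    """Equation 9: sum over all non-empty subsets X of the n variables
    of (-1)**(|Gamma| - |X| + 1) * H(X)."""
    total = 0.0
    for k in range(1, n + 1):
        for dims in combinations(range(n), k):
            total += (-1) ** (n - k + 1) * H(marginal(joint, list(dims)))
    return total

# For two dimensions, Q(AB) = H(A) + H(B) - H(AB), i.e., Q = T12:
print(Q({(0, 0): 0.5, (1, 1): 0.5}, 2))  # 1.0
# For three dimensions, Q has the opposite sign of T123 (cf. footnote 3):
xor = {(x, y, x ^ y): 0.25 for x in (0, 1) for y in (0, 1)}
print(Q(xor, 3))  # 1.0, whereas T123 = -1.0 for this configuration
```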

With the same enthusiasm which Krippendorff (2009a) reported about Ashby (1969), we embraced this potentially negative sign in the mutual information in three dimensions as an indicator of potential reduction of uncertainty in triple-helix configurations once it was brought to our attention by Robert Ulanowicz, who had used the same indicator in the context of his ascendancy theory in mathematical biology (Ulanowicz, 1986, p. 143). This same indicator is used across disciplines (for an overview, see Jakulin, 2005) and sometimes called “configurational information,” but it has remained controversial because it is poorly understood. As noted, a signed information measure cannot be interpreted in Shannon’s (1948) information theory whereas alternative frameworks for its appreciation have remained ill-defined (Krippendorff, 1980, 2009a, 2009b).

In general, the interaction between two variables can be conditioned by a third source of variation with which both variables can interact in terms of partial and spurious correlations. Spurious correlation can reduce uncertainty without being visible in the data without further analysis—and therefore is latent (Strand & Leydesdorff, 2013). Partial correlation can be measured using conditions in the equation, as with Pearson’s rxy|z (Sun & Negishi, 2010) or, nonparametri-

2The notation is now changed to 1, 2, 3, . . . , and so on, to allow for further extensions beyond three (1, 2, 3, . . . , N). Following Theil (1972), we generalize the notion of entropy as the average amount of information in a set of symbols to entropy statistics that can be used more abstractly for the analysis of relations among variables.

3This formula for one, two, and three dimensions takes the form (Krippendorff, 1980, 2009a, 2009b):

$Q(A) = -H(A)$

$Q(AB) = H(A) + H(B) - H(AB)$

$Q(ABC) = -H(A) - H(B) - H(C) + H(AB) + H(AC) + H(BC) - H(ABC)$

This measure Q alternates in sign (for dimensions of two or more), but with the opposite sign compared to the mutual information T as used earlier (Leydesdorff, 2010a).

FIG. 3. Overlapping uncertainties in three variables: 1, 2, and 3.


cally, Shannon’s Hxy|z as a measure of uncertainty in two dimensions conditioned by a third, and the corresponding mutual information Txy|z.

A spurious correlation can reduce uncertainty, as when two parents reduce uncertainty for a child by providing mutually consistent answers to questions. The answers by one parent inform the child in this case about the expected answers of the other because these answers are coordinated at a systems level. The marriage can be considered as a latent construct in which the child plays a role without constituting it. Note that the reduction of uncertainty cannot be attributed to specific agents but is generated in the network of relations as a configuration. In the case of divorce, for example, uncertainty among the answers may begin to prevail over the synergy measured as reduction of uncertainty at the systems level.

Similarly, in university–industry–government relations, a strong existing relation between two of the partners can make a difference for the third (Burt, 2001). The Triple-Helix thesis, however, invites opening up to a fourth, fifth, and next-order helical model (Carayannis & Campbell, 2009, 2010; Leydesdorff, 2012a; Leydesdorff & Etzkowitz, 2003). More recently, Ye, Yu, and Leydesdorff (in press) argued that globalization can unravel the triple helix in terms of the fourth dimension of local (e.g., national) versus global organization. Leydesdorff and Sun (2009), for example, elaborated coauthorship data with institutional addresses in terms of industrial, academic, or governmental authors, with the additional dimension national/international, and concluded that the unraveling of this system in Japan was counteracted by retaining reduction of uncertainty from international collaborations. However, a similar effect could not be found in Canadian or Korean data (Kwon et al., 2012).

One problem with the extension of mutual information into three or more dimensions, in our opinion, has remained the alternating signs of the equations. What are the consequences in empirical research when three positive terms are added to Equation 7 to obtain Equation 8, but only a single one (H1234) is subtracted? The likelihood of a sign change because of these definitions with alternating signs is considerable.

Figure 4 shows this problem empirically using the data noted earlier about university–industry–government (“uig”) coauthorship relations in Japan combined with international coauthorship relations (using the “f” of “foreign” for an estimate of the non-Japanese addresses involved). Figure 4 shows first that over the entire period of 1980 to 2005, university–industry relations (measured as Tui) in two dimensions declined whereas international (university–foreign) relations (measured as Tuf) increased steadily. (Note that transmission also can be considered as a measure of covariation.)

The synergy in the national system of triple-helix relations (Tuig) thus eroded, but four-dimensional Tuigf also exhibited a systematic trend since the early 1990s (i.e., after the demise of the Soviet Union and the opening of China). Such a global change in the systems dynamics since about 1993 also was found in comparable studies (e.g., Lengyel & Leydesdorff, 2011; Ye et al., in press) and has been interpreted as a systems change because of further “globalization” emerging in the knowledge-based economies after the end of the Cold War (Leydesdorff & Zawdie, 2010).

The remaining question, in our opinion, is whether this possible decline in Tuigf (in Figure 4) since the early 1990s indicates a decrease or an increase in uncertainty. Having no other theoretical means available, the authors at the time opted for a decline, but the sign change between the

FIG. 4. Development of mutual relations between authorship with (u)niversity, (i)ndustry, (g)overnmental, and (f)oreign addresses in Japan, 1980–2005; Web of Science data. Source: L. Leydesdorff & Y. Sun, 2009, p. 783. [Color figure can be viewed in the online issue, which is available at wileyonlinelibrary.com.]


three- and four-dimensional information measure remained worrisome, and this worry became more acute after Krippendorff’s (2009a, 2009b) critique (also see Leydesdorff, 2009, 2010a). Krippendorff (2009b) attributed the erratic behavior of the measure to next-order loops that generate redundancy whereas the transmission values in more than two dimensions were understood as a difference between redundancy generated in next-order loops and Shannon-type information processing generating probabilistic entropy in the interactions.

In our opinion, Krippendorff’s (2009b) specification of mutual information in three dimensions as a difference did not address the conceptual problem of the sign change when the dimensionality is augmented. In what follows, we propose an approach within the framework of the mathematical theory of communication that answers this question and further clarifies that meaning cannot be communicated in the framework of Shannon’s (1948) theory, but that the circulation of different meanings as a second-order effect of the communication of information may increase the redundancy. The elegance of the new measure of meaning-exchange is that it does not affect the sign in the three-dimensional case but harmonizes all higher dimensional cases in accordance with this interpretation.

Generation of Redundancy

The mutual transmission in the communication of information reduces the uncertainty that prevails using Equations 1 and 2; but let us assume that new options can be added if meaning processing is additionally involved. Expansion of the number of options (N) adds to the maximum entropy: $H_{max} = \log(N)$. Given a prevailing uncertainty H, the redundancy R expands since the latter is defined as the fraction of the capacity that is not used for the uncertainty that prevails. In formula format:

$R = 1 - \frac{H}{H_{max}} = \frac{H_{max} - H}{H_{max}}$ (10)

We focus on the numerator in Equation 10. Redundancy is then the complement of relative uncertainty: The two add up to the maximum (100% of) entropy. This technical definition of redundancy may seem as counterintuitive as Shannon’s (1948) definition of “information” since redundancy is intuitively associated with the overlapping information in consecutive messages.
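Equation 10 can be illustrated as follows (a minimal sketch; the function name and the example distributions are ours), with Hmax = log2(N) for N options:

```python
import math

def H(probs):
    """Shannon entropy in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def redundancy(probs):
    """Equation 10: R = 1 - H/H_max = (H_max - H)/H_max,
    with H_max = log2(N) for a distribution over N options."""
    h_max = math.log2(len(probs))
    return 1 - H(probs) / h_max

# A uniform distribution uses its full capacity: R = 0.
print(redundancy([0.25] * 4))  # 0.0
# A skewed distribution leaves part of the capacity unused: R > 0.
print(redundancy([0.7, 0.1, 0.1, 0.1]))  # ~0.32
```

Enlarging N while holding H fixed raises Hmax and thereby the redundancy, which is the mechanism appealed to in the argument above.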

Whereas the assumed increase in redundancy is generated in relation to the information exchanges, it remains redundancy—analytically different from uncertainty—despite this operational coupling. Because we, as analysts, have no information about the size of the additional redundancy, let us for reasons of parsimony assume that the newly generated redundancy is equal in absolute size to the transmission. We can thus define an “excess” information value

Y12—equivalent to H12, but with the plus sign since we do not correct for the duplication in the case of redundancies—as follows:

Y1 = H1
Y2 = H2
Y12 = H1 + H2 + T12 = H12 + 2T12 (11)

Replacement of H12 with Y12 in Equation 2 gives analogously the formula for the mutual redundancy R12 as follows:

R12 = H1 + H2 − Y12, (12)

and it follows (using Equation 11) that:

R12 = H1 + H2 − (H12 + 2T12) = H1 + H2 − ([H1 + H2 − T12] + 2T12) = −T12 (13)

Since T12 is necessarily positive (Theil, 1972, p. 59 ff.), it follows from Equation 13 that R12 is negative and therefore cannot be anything other than the consequence of an increased redundancy. This reduction of the uncertainty is measured in bits of information—that is, as uncertainty—and therefore the sign is negative as a subtraction (Leydesdorff, 2010a, p. 68). In other words, our mathematical assumption leads to a mutual redundancy generated at the next-order level equal to the transmission in the communication, but with the opposite sign. Consequently, R12 can be expressed in terms of negative amounts (e.g., bits) of information (i.e., as reduction of uncertainty).
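This derivation can be verified numerically. In the following sketch (our illustration; the joint distribution is a hypothetical example), Y12 and R12 are computed exactly as in Equations 11 and 12, and R12 = −T12 follows:

```python
from math import log2

# Hypothetical joint probability distribution p(x, y) over two binary variables.
p_xy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

def H(p):
    """Shannon entropy in bits of a distribution given as {outcome: probability}."""
    return -sum(v * log2(v) for v in p.values() if v > 0)

# Marginal distributions and their entropies H1, H2; joint entropy H12.
p_x, p_y = {}, {}
for (x, y), v in p_xy.items():
    p_x[x] = p_x.get(x, 0) + v
    p_y[y] = p_y.get(y, 0) + v
H1, H2, H12 = H(p_x), H(p_y), H(p_xy)

T12 = H1 + H2 - H12  # mutual information (transmission)
Y12 = H1 + H2 + T12  # "excess" information: overlap counted twice (Equation 11)
R12 = H1 + H2 - Y12  # mutual redundancy (Equation 12)

assert T12 > 0
assert abs(R12 + T12) < 1e-12  # R12 = -T12 (Equation 13)
```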

Interestingly, the formulation in Equation 11 is precisely equivalent to working with “pure sets” (e.g., as in the case of the document sets in Figure 2). One no longer corrects for the overlap—the mutual information—but assumes that the two sets influence each other at a next level using another mechanism; namely, in terms of positions. The overlap (in Figure 2) is thus not subtracted but counted twice and is therefore redundant. The influencing at this next level among pure sets is different from the relational interaction at the network level that generates mutual information. The mutual redundancy can be considered as a consequence of the positions of the communicating systems in relation to one another, given the possible communication of information between them. Mutual redundancy R measures the surplus of options that are generated when meaning-processing systems communicate in terms of the information exchange.

Without an information exchange, T12 is zero; therefore, R12 = −T12 also is zero. The meaning exchange is instantiated by the information exchange as the carrying operation, but remains a positional potential. The field metaphor is useful here: The two sets influence each other by radiating, but disturb each other only to the extent that energy is possibly transferred. Otherwise—for example, in the case of large distances—the sets remain irrelevant black boxes to each other.

6 JOURNAL OF THE AMERICAN SOCIETY FOR INFORMATION SCIENCE AND TECHNOLOGY—•• 2013. DOI: 10.1002/asi


For the three-dimensional case and using Figure 3, we can define, in addition to the two-dimensional values of Y (in Equation 11), a three-dimensional value as follows:

Y123 = H1 + H2 + H3 + T12 + T13 + T23 + T123 (14)

Information-theoretically (on the basis of Equation 6), however:

H123 = H1 + H2 + H3 − T12 − T13 − T23 + T123, (15)

and it therefore follows that the difference is:

Y123 − H123 = 2T12 + 2T13 + 2T23 (16)

Using Y values instead of H values for joint entropies in Equation 3, one obtains:

R123 = H1 + H2 + H3 − (H12 + 2T12) − (H13 + 2T13) − (H23 + 2T23) + (H123 + 2T12 + 2T13 + 2T23) = T123 (17)

In the three-dimensional case, the mutual redundancy is identical to the previously studied mutual information in three dimensions. This explains why the mutual information in more than two dimensions can be negative and therefore cannot be Shannon-type information.

In other words, in the case of measuring the mutual information in three dimensions, one never measured Shannon-type information but this mutual redundancy among the sets. The possible reduction of uncertainty—if this measure provides a negative amount (e.g., bits) of information—also can be considered as a measure of the synergy among three sources of variance when specifically positioned in relation to one another. These sources are positioned in terms of their relations in the networks that carry the systems in terms of operations. The network of relations spans an architecture in which all links and nodes have a position.
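That this signed measure can indeed turn negative is easily shown with the classic XOR configuration; the sketch below (our example, not from the original text) computes T123 = H1 + H2 + H3 − H12 − H13 − H23 + H123 for two independent fair bits and their parity:

```python
from itertools import product
from math import log2

def H(p):
    """Shannon entropy in bits of a joint distribution {outcome_tuple: prob}."""
    return -sum(v * log2(v) for v in p.values() if v > 0)

def marginal(p, axes):
    """Marginal distribution over the given coordinate axes."""
    m = {}
    for outcome, v in p.items():
        key = tuple(outcome[i] for i in axes)
        m[key] = m.get(key, 0) + v
    return m

# X and Y are independent fair bits; Z = X XOR Y.
p = {(x, y, x ^ y): 0.25 for x, y in product([0, 1], repeat=2)}

H1, H2, H3 = (H(marginal(p, [i])) for i in range(3))
H12, H13, H23 = (H(marginal(p, a)) for a in ([0, 1], [0, 2], [1, 2]))
H123 = H(p)

# McGill's mutual information in three dimensions (= R123 in this calculus):
T123 = H1 + H2 + H3 - H12 - H13 - H23 + H123
print(T123)  # -1.0: a negative "information" value, read here as redundancy/synergy
```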

Etzkowitz and Leydesdorff (2000), for example, used the metaphor of an overlay of communications or “hypercycle” (Leydesdorff, 1994) in university–industry–government relations. The synergy measure used as a triple-helix indicator can alternatively be named “configurational information”; it is Yeung’s (2008) signed information measure, which he denoted with m*. In our opinion, Yeung’s m* can be considered as mutual redundancy; if R123 < 0 in bits (or any other measure) of information, the prevailing uncertainty is reduced. Ulanowicz (2009) elaborated on this measure using the metaphor of autocatalysis: The third variable can operate as a catalyst on the two others by closing a loop among the three. A spurious correlation can then be generated autocatalytically.

Let us now extend to the four-dimensional case by drawing Figure 5, as follows:

We subtract the following two equations

H1234 = H1 + H2 + H3 + H4 − T12 − T13 − T14 − T23 − T24 − T34 + T123 + T124 + T134 + T234 − T1234 (18)

Y1234 = H1 + H2 + H3 + H4 + T12 + T13 + T14 + T23 + T24 + T34 + T123 + T124 + T134 + T234 + T1234 (19)

with the following result:

Y1234 = H1234 + 2T12 + 2T13 + 2T14 + 2T23 + 2T24 + 2T34 + 2T1234 (20)

Using Y values instead of H in Equation 8 for the joint entropies, we obtain (using also Equations 11 and 16 for the substitution):

R1234 = H1 + H2 + H3 + H4 − (H12 + 2T12) − (H13 + 2T13) − (H14 + 2T14) − (H23 + 2T23) − (H24 + 2T24) − (H34 + 2T34) + (H123 + 2T12 + 2T13 + 2T23) + (H124 + 2T12 + 2T14 + 2T24) + (H134 + 2T13 + 2T14 + 2T34) + (H234 + 2T23 + 2T24 + 2T34) − (H1234 + 2T12 + 2T13 + 2T14 + 2T23 + 2T24 + 2T34 + 2T1234)
= H1 + H2 + H3 + H4 − H12 − H13 − H14 − H23 − H24 − H34 + H123 + H124 + H134 + H234 − H1234 − 2T1234
= T1234 − 2T1234
= −T1234 (21)

In summary, we can see that the problem of the change of sign is resolved by using R instead of T since:

R12 = −T12 (22)

FIG. 5. Overlapping uncertainties in four variables: 1, 2, 3, and 4 (Rousseau, 1998; Durrett, s.d.).



R123 = T123 (23)

R1234 = −T1234 (24)

Equations 22, 23, and 24 show first that R can have a negative sign; Equation 22 warrants this in the two-dimensional case. Although (Shannon-type) information and redundancy are positive, we can measure mutual redundancy also as reduction of uncertainty. In the three-dimensional or, more generally, the odd-dimensional cases, the mutual information among these dimensions is defined as equal to the measurement of the mutual redundancy, but in the even-dimensional cases the sign of this redundancy is opposite to the mutual information.
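The pattern of Equations 22 to 24 suggests a general recipe. The sketch below (our construction, assuming that T in N dimensions is the usual alternating inclusion-exclusion sum over marginal entropies) flips the sign of T for even N so that R behaves consistently:

```python
from itertools import combinations, product
from math import log2

def H(p):
    """Shannon entropy in bits of a joint distribution {outcome_tuple: prob}."""
    return -sum(v * log2(v) for v in p.values() if v > 0)

def marginal(p, axes):
    """Marginal distribution over the given coordinate axes."""
    m = {}
    for outcome, v in p.items():
        key = tuple(outcome[i] for i in axes)
        m[key] = m.get(key, 0) + v
    return m

def T(p, n):
    """Mutual information in n dimensions: inclusion-exclusion over entropies,
    with positive sign for singletons, negative for pairs, and so on."""
    total = 0.0
    for k in range(1, n + 1):
        for axes in combinations(range(n), k):
            total += (-1) ** (k + 1) * H(marginal(p, list(axes)))
    return total

def R(p, n):
    """Mutual redundancy: equal to T for odd n, to -T for even n (Eqs. 22-24)."""
    return (-1) ** (n + 1) * T(p, n)

# Demo: two independent fair bits and their XOR, as a three-dimensional system.
p = {(x, y, x ^ y): 0.25 for x, y in product([0, 1], repeat=2)}
print(R(p, 3))                    # equals T123 here (odd dimensionality)
print(R(marginal(p, [0, 1]), 2))  # equals -T12 <= 0 (even dimensionality)
```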

In summary, the alternating sign is thus explained by assuming that the mutual positioning of reflexive communication systems in terms of relations can generate mutual redundancy that is (technically) equal in size to the (previously defined) mutual information in the relations. Krippendorff’s (2009a) insistence on the interaction information IABC→AB:AC:BC as the Shannon-type information in three dimensions different from TABC can then be fully appreciated. The signed information measure of Yeung’s m* always has measured a non-Shannon redundancy R (albeit with a potential sign change depending on the dimensionality) whereas Krippendorff’s (2009a) interaction information can be considered a Shannon-type measure.4

The Interpretation

Our information-theoretical interpretation first follows Krippendorff’s (2009b) assertion “that interactions with loops entail positive or negative redundancies, those without loops do not” (p. 676; italics in the original). Mutual information in three or more dimensions was considered by Krippendorff (2009b) as the difference between redundancy thus generated and the Shannon-type information generated in the interactions; or, to use Weaver’s (1949) terminology, between noise generated in the engineering box and the (next-order) semantic noise-generator.

We follow Weaver’s (1949) interpretation by considering the two boxes (of engineering noise and semantic noise in Figure 1) as operating in parallel, with a potential net result at the level of the network channel. The two operations are analytically independent, but mutual redundancy can be generated only in relations (i.e., when the channel operates). Both boxes can be considered as conditions on the information exchange, but mutual redundancy adds to the maximum entropy of the channel. In other words, it changes the channel capacity and is not a Shannon-type information that assumes a fixed channel. This expansion of the maximum entropy may more than compensate for the interaction information that is generated relationally (Brooks & Wiley, 1986; Leydesdorff, 2011). Relational operation is a necessary condition for the generation of mutual redundancy. Note that there also can be other sources of redundancy, such as through codification of the communication within a system (discussed later).

Following Weaver (1949), one can assume that meaning processing can add to the redundancy, but this “semantic noise” had not been specified. Our proposal is that for reasons of parsimony, the mutual redundancy can be mathematically defined as equal to the mutual information in the respective number of dimensions, but with a possible correction for the sign as specified earlier. However, redundancies do not relate operationally but only condition the relational operation. From a set-theoretical approach, therefore, the sets were first considered as independent (“pure” or orthogonal). The pure sets can be redundant in terms of their information content, and this can structure the relations as “semantic noise” (Figure 1). Whether such structuring adds to or reduces the uncertainty in the total system of variables is an empirical question.

The choice of this value for, and sign of, the mutual redundancy is convenient because:

1. In the three-dimensional case, R123 = T123 and thus the signed information measure that has been used in empirical studies throughout the literature remains a valid measure, albeit with a different interpretation. If R123 < 0, uncertainty is reduced, and if R123 > 0, uncertainty is increased.

2. R12 = −T12 in the two-dimensional case; R12 is necessarily ≤ 0, since T12 ≥ 0 (Theil, 1972, p. 59f); measurement of R12 as reduction of uncertainty makes mutual redundancy consistent with Shannon’s theory.

3. In the N-dimensional case, the sign of the mutual information T would alternate, but the sign of the mutual redundancy R remains consistent.

4. This reasoning does not affect the specification of Krippendorff’s (2009b) “interaction information” (IABC→AB:AC:BC) as a Shannon-type information in three or more dimensions; R is not a Shannon-type information, but the calculus thus remains consistent with Shannon’s (1948) mathematical theory of communication.

Let us proceed to a systems-theoretical appreciation. From a Shannon perspective, meaning cannot be communicated relationally and only information can be communicated between a sender and a receiver. Bateson’s (1972, p. 315) “difference which makes a difference,” however, is always provided with reference to a system. By acknowledging the difference made, the incoming (Shannon-type) uncertainty is positioned within a receiving system, and thus provided with local meaning. Unlike information processing that is based on operational coupling in relations, meaning processing operates as a next-order loop in terms of positions that influence one another as fields.

For didactic purposes, the difference between mutual redundancy between systems with different positions and

4Krippendorff’s (2009a) approximation of this value can be extended to more than three dimensions using Occam Version 3.3.10 at http://occam.research.pdx.edu/occam/weboccam.cgi?action=search (Willett & Zwick, 2004; cf. Krippendorff, 1986).



mutual information in their relations can be compared with the difference between the additive and subtractive mixing of colors: Red, green, and blue (or cyan, magenta, and yellow) add up to white additively—for example, in terms of three lamps that are positioned differently—but the result of operational mixing of these three colors (relationally) leads to subtractive color mixing and ultimately to black (Figure 6).

Certain elements of Luhmann’s (1984/1995) social systems theory also can be helpful for explaining this functioning of meaning processing in terms of positions. Based on Parsons’s (1968) theory of the symbolic generalization of media, as in the case of money and power, Luhmann considered functional differentiation among the (symbolically generalized) media to be based on different codes operating in subsystems of communication. For example, he suggested true/false as the code of communication in scientific communication versus transactions (“payment”) as the code on the market (Luhmann, 1997). Parsons (1963a, 1963b) distinguished between “power” and “influence” in these terms, and one also can consider affection (“love”) as a specific coding of the communication (Luhmann, 1986b). The codes thus color the communications like colors of light without yet (analytically) implying the necessity of a relational (“operational”) coupling or mixing. Without communication, however, the codes cannot operate.

Following Maturana’s (e.g., 1978) theory of autopoiesis, Luhmann (1984/1995; 1986a) originally argued that the “function systems”—specifically coded communication (sub)systems—are operationally closed and structurally coupled. This metaphor is (meta-)biological: A functionally differentiated cell in a bodily organ such as the liver cannot perform a lung-cell function once differentiation has taken place (in the gastrula stage of embryonic development), but both cells are structurally part of the same body. Functionally differentiated systems operate in parallel, but they can influence each other in terms of operational couplings at the network level (e.g., via the bloodstream).

In response to criticism, Luhmann (1997, p. 788) added that communications can be operationally coupled in events whereas the same communication can be provided with different meanings at different places. He elaborated the example of a physician who provides an employee with a note that legitimates absence from the workplace because of a health condition. In this event, the health system is operationally coupled to the economic system, but the meanings are updated in each of the different subsystems that are asymmetrically coupled in the operation. One also can consider the coupling as a “translation” or an “overflow” (Callon, 1998).

In terms of triple-helix relations, for example, patents can be considered as output of research and development systems and inputs to the economy. They serve at the same time as intellectual property protection. The three different institutional spheres are recombined in the specific organization of a patent as an instant in the relevant flows. In other words, each relational operation is contingent and historical: Mutual information is generated in the interaction that is necessarily organized in (and mediated by) historical events. The functional subsystems, however, self-organize in terms of updating their information contents at the level of their respective systems using specific codes. New meanings can be generated as possible options or, in other words, redundancies in overlaps. The subsystems are positioned in relation to one another in a function space that is different from the network space in which the operations are performed relationally (Bradshaw & Lienert, 1991; Simon, 1962; cf. Leydesdorff, in press).

Simon (1973, pp. 19ff) argued that complex systems can be expected to operate with alphabets. In other words, the codes of communication can be considered as alphabets. However, the number of coding systems can grow (or shrink) over historical times. Parsons’ structural-functionalism could thus be historicized (Merton, 1957; Mitroff, 1974). More codes allow for more complexity in the communication by generating more possible redundancies. In each event (or “instantiation;” Giddens, 1979, e.g., at p. 64), however, specific codes are (de-)selected more than are others, and thus different mixtures of meanings are continuously generated over historical time.

Note that self-organization is an evolutionary dynamics hypothesized at a transhistorical level—“genotypical” in evolutionary terms (Hodgson & Knudsen, 2011)—that can be expected to leave “phenotypical” (i.e., observable) imprints on the historical and observable organization of events. The supra-individual level of intentionality, however, remains an order of expectations that can fruitfully be hypothesized, but cannot be observed directly (Leydesdorff, 2012b).

Giddens (1979) expressed this relation between “absent” redundancies and “present” relations so well that his text merits quotation in this context:

That is to say, the differences which constitute social systems reflect a dialectic of presences and absences in space and time. But these are only brought into being and reproduced via the virtual order of differences of structures, expressed in the duality of structure. The differences that constitute structures, and are constituted structurally, relate “part” to “whole” in the sense in which the utterance of a grammatical sentence presupposes the absent corpus of syntactical rules that constitute the language as a totality. The importance of this relation of moment and totality for social theory cannot be exaggerated, since it involves a dialectic of presence and absence which ties the most minor or trivial forms of social action to structural properties of the overall society (and, logically, to the development of mankind as a whole). (p. 71)

FIG. 6. Additive and subtractive color mixing.

As Giddens (1979) went on to explain, institutions therefore do not work “behind the backs” of the social actors who produce and reproduce them as structured expectations in interactions.

From a communication-theoretical (as distinct from Giddens’s action-oriented) perspective, Luhmann (1975, 1986c) specified three levels of “structuration”: (a) interactions at the micro level that can be expected to encompass potentially all functions—the full spectrum—of interhuman communications; (b) aggregates of communication can partially and selectively integrate differently coded communications relationally at specific (historically observable!) interfaces; and (c) at the level of society, events can be appreciated and coded differently depending on the macro-structures that have been developed in interhuman communications as structures of expectations. For example, the rule of law and the functioning of the market—as an invisible hand operating at the supra-individual level—are two such counterfactual coordination mechanisms operating among us in terms of mutual expectations.

When information is communicated between (sub)systems using different codes—that is, the encoding and decoding of the messages are performed with different algorithms—then the received message can be given a variety of meanings, and thus the number of options is rapidly enlarged (Leydesdorff, 1997). The maximum uncertainty of two interacting systems thus can be expected to increase with this redundancy, but not automatically since mutual redundancy is only generated in configurations of relations.

The mutual redundancy adds semantic noise to the engineering noise. Our proposal is to estimate this (potentially negative) contribution to the uncertainty as equal to the mutual information—because one has no other information—but with the opposite sign in the case of an even number of dimensions. The advantage of our definition of mutual redundancy in terms of pure sets is its consistency with Shannon’s (1948) framework and the elaboration of Weaver’s (1949) intuitions in this context (Shannon & Weaver, 1949). Furthermore, this extension of the mathematical theory of communication resolves the problem of the alternating sign in the mutual information in more than two dimensions, and can be interpreted in terms of pure sets that operate concurrently and thus can be redundant (since repetitive) from a system’s perspective. The redundant signals always form a surplus in the communication. Mutual information in more than two dimensions has thus far been misunderstood as an information measure; it is actually a measure of mutual redundancy.

Double Contingency

One should remain aware of the confusing metaphors when mixing the information-theoretical with Luhmann’s (e.g., 1986a) communication-theoretical perspective. When one uses “information” and “communication” with reference to Shannon’s (1948) information theory, the concepts are defined without substantive meaning as yet and can be measured, for example, in bits of information. The specification of a system of reference provides the communication with dimensionality and meaning. For example, one can communicate with words or with money. Both financial transactions and co-occurrences of words can be measured in terms of bits of information (Leydesdorff, 1995).

As noted, sociologists (including Luhmann) use “communication” specifically for interhuman communication. In interhuman communication, information is exchanged, but intentionality also can be expected to play a role (Husserl, 1929/1973). Meaning, however, cannot be communicated directly but only in terms of instantiations in events where different meanings are related in terms of information exchanges. Yet, meaning in terms of mutual expectations also influences each interhuman communication. If the “semantic noise” were zero, the interhuman communication could not be received as meaningful; it might be coded—and then possibly discarded—as merely engineering noise.

Using different terminologies, 20th-century sociologists and phenomenologists have tried to express this duality between uncertainty and meaning processing in interhuman communications. For example, Mead (1934) distinguished between “I” and “me”—the individual being at the same time both subject and object; Husserl (1929/1973) returned to Descartes’s “res extensa” and “res cogitans,” where the latter remains in the domain of intentionality; Geertz’s (1973) difference between “emic” and “etic” in anthropology refers to whether the analyst takes both dimensions into account by “understanding” or merely observes behavior; and, as noted, Giddens (1979) used “the duality of structure,” but he did not wish to specify the operation of this duality other than methodologically (e.g., Giddens, 1984, p. 304; cf. Bryant & Jary, 1991, p. 96; Leydesdorff, 1993, p. 54f).5 As noted earlier, Luhmann specified “structuration” in terms of interaction, organization, and self-organization.

Luhmann also elaborated on Parsons’s notion of double contingency. Parsons (1968) distinguished two analytically different contingencies as specific for interhuman relations: one in terms of the res extensa and one in terms of what the contingent other(s)—things, people—mean to us. The second of these contingencies generates what he called a “double contingency” in the interhuman encounter: Ego expects Alter to entertain expectations similar to Ego’s own expectations (Luhmann, 1995, pp. 103–136; Parsons, 1951,

5Giddens (1976, p. 162) also specified “double hermeneutics” as constitutive for the social sciences. However, the “double hermeneutics” refers specifically to the two-way communication between scholarly and lay concepts in the social sciences whereas “double contingency” plays a role in all interhuman communication because of the reflexivity involved.



p. 94; Parsons & Shils, 1951, p. 16; 1968). The sharing and exchanging of expectations in a double contingency opens “horizons of meaning” (Husserl, 1929/1973, 1935/1936/1962; Leydesdorff, 2012b).

In our opinion, the exchange of meaning remains an epiphenomenon that accompanies the exchange of information in human language or by way of symbolically generalized media of communication for the nonverbal exchange as a second-order effect. However, the epiphenomenal level, while constructed and continuously reconstructed from the bottom up, can take control when sufficiently codified by providing references to “horizons of meaning.” The feedback also can then become a feed-forward to other possible meanings because redundancy (the number of options) can be proliferated faster than can (Shannon-type) information, for example, in knowledge-based systems. The extent to which this inversion happens remains an empirical question. The measurement of mutual redundancy makes such questions amenable to empirical research (e.g., Leydesdorff & Strand, 2013).

Discussion

The definition of the mutual redundancy as uncertainty (or reduction of uncertainty), but equal in absolute size to the mutual information, may seem to impose an arbitrary limitation on the number of options that can be added to the maximum entropy in next-order loops of meaning processing. However, this limitation makes our reasoning consistent with the mathematical theory of communication and provides the mutual information in three dimensions with an interpretation whereas such an interpretation has been unavailable thus far (despite the many empirical applications of the measure). Indeed, the mutual information in three dimensions, m*, was never Shannon-type information but mutual redundancy R123. The Shannon-type interaction information IABC→AB:AC:BC in three dimensions obeys a very different calculus (Krippendorff, 1980, 2009a, 2009b).

The lower limit of mutual redundancy in two dimensions (R12) is zero if the mutual information is zero. This is unproblematic; but an upper limit of R12 = −T12 seems unnecessarily constraining on meaning processing while meanings may develop much faster in terms of reflexive options than can be retained historically. Wishes, expectations, and fantasies, for example, proliferate faster than do actions (Weinstein & Platt, 1969). However, this proliferation is within communication systems and not in terms of their operational relations. Mutual redundancy is only a part of the total redundancy of interacting communication systems.

Let us combine Luhmann’s (1995) and Simon’s (1973) conjectures that interhuman communications operate with an alphabet of functionally different codes. If N codes are used for the communication—instead of a single set of codes—mutual redundancies can be generated concurrently in all of these relations as parallel channels of the communication. Already at the level of language (i.e., below the order of symbolically generalized codes of communication), the reflexive observer and the participant are able to change roles, and thus mutual contingencies rapidly can be instantiated. The option to change repertoires in negotiations to enlarge the number of options is part of the communicative competencies of participants, for example, in the construction of discourse coalitions (Hajer, Hoppe, Jennings, Fischer, & Forester, 1993; Levidow, 2009; Wagner & Wittrock, 1990).

Within each of Luhmann’s (1984/1995) “function systems” or Giddens’s (1979) “sets of rules and resources” (e.g., science, economy, or polity), symbolic generalization of the code adds another degree of freedom to the processing of meaning. For example, whereas economists and engineers may provide different meanings and values to patents, their lawyers can be expected to use a very different rationale when litigating in the court room. More options for providing meaning are thus generated by functional differentiation of the codes of communication, and thus redundancy may proliferate more rapidly than only in terms of mutual redundancies. These longer-term developments over time stand in orthogonal relation to the interaction information and mutual redundancy generated in instantiations (or events) at specific moments in time.

Note that symbolic generalization also can be expected to have a next-order effect on coding as an operation. Whereas (undifferentiated) meaning is first provided from the perspective of hindsight—that is, “incursively” instead of “recursively”—symbolic generalization can lead to hyperincursivity. The calculus of meaning processing thus can be embedded in the theory and computation of anticipatory systems (Dubois, 1998, 2003; Rosen, 1985), but such an extension would lead us beyond the scope of the present study (cf. Leydesdorff, 2010b).

The message to remember is that mutual redundancy is not total or maximum redundancy, just as mutual information is only part of a prevailing uncertainty. From this perspective, the present study can be characterized as just a first step toward a calculus of meaning; that is, the specification of redundancies and uncertainties in the processing of meaning. Here, we elaborated mutual redundancy from a sociological perspective—because the understanding of “meaning” is urgent at this level—but note that synergy also can be measured using Txyz (= Rxyz) at many other systems levels.

For example, Maturana (1978) formulated the possible emergence of a semantic domain at other levels as follows:

(I)f an organism is observed in its operation with a second-order consensual domain, it appears to the observer as if its nervous system interacted with internal representations of the circumstances of its interactions, and as if the changes of state of the organism were determined by the semantic value of these representations. (p. 49)

Maturana then argued that one should not in the biological case be misled by the linguistic metaphor. Our specific research question in this study, however, was about the



generation of redundancy in interhuman communication (i.e., within a semantic domain where second-order representations function in the communication). The further extension to a biological (or other) system that also can be considered as a semantic domain is expected to require a different approach.

Conclusions and Summary

This study originated from the puzzle caused by the possible change in sign when mutual information is measured in a next dimension. In the discussions about extending the number of helices beyond a triple helix of university–industry–government relations to a quadruple, quintuple, or n-tuple set of helices, this problem had to be urgently resolved (Carayannis & Campbell, 2009, 2010; Leydesdorff, 2012a; Leydesdorff & Sun, 2009; Kwon et al., 2012). We could solve the technical puzzle by considering “pure” sets; that is, without accounting for the overlap as transmission but by adding “excess” entropy to the equation as additional redundancy. The concept of “mutual redundancy” then had to be provided with an interpretation. We have argued that mutual information in three dimensions as defined and used in the literature since its invention by McGill (1954) always has been a measurement of mutual redundancy. If R123 < 0, this mutual redundancy has a negative (reducing) effect on the uncertainty that prevails.

When information is communicated between reflexive (sub)systems using different codes (i.e., the encoding and decoding of the messages are performed with different algorithms), the received message can be given a variety of meanings, and thus the number of options is enlarged. The two interacting systems can then be expected to generate mutual redundancy, but not automatically: mutual redundancy is generated in relations. We propose to estimate this uncertainty as equal to the mutual information (because one has no other information), but with a consistent (and therefore sometimes opposite) sign. The correct sign (when uncertainty is expressed in bits of information) follows from our definitions. The advantage of this proposal is its consistency with Shannon's (1948) framework, in which probabilistic entropy is always positive and increasing; loops of communication could thus far not be appreciated from this perspective. Furthermore, this extension of the mathematical theory of communication solves the problem of the alternating sign in mutual information in more than two dimensions by clarifying this concept as mutual redundancy.
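The proposed sign convention (mutual redundancy equal to mutual information in odd dimensions, with opposite sign in even ones, so that it is consistently expressed in bits) can be sketched minimally. The function name and the parity rule are our reading of the proposal, not the authors' notation:

```python
def mutual_redundancy(t_n: float, n: int) -> float:
    """Mutual redundancy R_N estimated from the N-dimensional mutual
    information T_N with a consistent sign: R_N = T_N for odd N and
    R_N = -T_N for even N (our reading of the proposal; an assumption)."""
    if n < 2:
        raise ValueError("mutual information requires at least two dimensions")
    return t_n if n % 2 else -t_n

# e.g., in three dimensions R123 equals T123; in four dimensions the sign flips
```

This reproduces the special case stated earlier in the article (Rxyz = Txyz for a triple helix) while flipping the sign whenever the dimensionality is even.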

We provide a systems-theoretical appreciation in terms of Luhmann's (1995) social systems theory and Giddens's (1979) structuration theory. Luhmann's scheme first distinguished between human agents who consciously process meaning and interhuman communication. These two systems are structurally coupled but, unlike the biological concept, operational closure is not achieved because meaning is provided reflexively, and thus mutual redundancy also is generated in the operation of "interpenetration" at the reflexive level (Luhmann, 1991, 2002).6

We have argued that Luhmann's double structure of organization and self-organization is not very different from the structures of Giddens (1979, p. 71; 1984, p. 377) as different "sets of rule-resource relations" that are "instantiated," and therefore interfaced, in actions as events. If one changes Giddens's "structuration of action" into the "structuration of expectations," the two theories become virtually identical (Leydesdorff, 2010b). Reflexive (inter)actions (by and among human beings) can then be considered as specific ways to organize expectations. It seems to us that our proposal precisely fills the duality between "absent" self-organization and "present" organization/instantiation in terms that also can be measured in bits of information.

Acknowledgments

We are grateful to Wouter de Nooy, Rob Hagendijk, Ronald Rousseau, and two anonymous referees for comments on a previous version of this article.

References

Abramson, N. (1963). Information theory and coding. New York, NY: McGraw-Hill.

Ashby, W.R. (1969). Two tables of identities governing information flows within large systems. American Society for Cybernetics Communications, 1(2), 3–8.

Bateson, G. (1972). Steps to an ecology of mind. New York, NY: Ballantine.

Bradshaw, G.L., & Lienert, M. (1991). The invention of the airplane. In Proceedings of the 13th Annual Conference of the Cognitive Science Society (pp. 605–610).

Brillouin, L. (1962). Science and information theory. New York, NY: Academic Press.

Brooks, D.R., & Wiley, E.O. (1986). Evolution as entropy. Chicago, IL: University of Chicago Press.

Bryant, C.G.A., & Jary, D. (1991). Giddens' theory of structuration: A critical appreciation. London, England: Routledge.

Burt, R.S. (2001). Structural holes versus network closure as social capital. In N. Lin, K. Cook, & R.S. Burt (Eds.), Social capital: Theory and research (pp. 31–56). New Brunswick, NJ: Transaction.

Callon, M. (1998). An essay on framing and overflowing: Economic externalities revisited by sociology. In M. Callon (Ed.), The laws of the markets (pp. 244–269). Oxford, England: Blackwell.

Carayannis, E.G., & Campbell, D.F.J. (2009). "Mode 3" and "Quadruple Helix:" Toward a 21st century fractal innovation ecosystem. International Journal of Technology Management, 46(3), 201–234.

Carayannis, E.G., & Campbell, D.F.J. (2010). Triple helix, quadruple helix and quintuple helix and how do knowledge, innovation, and environment relate to each other? International Journal of Social Ecology and Sustainable Development, 1(1), 41–69.

Cassirer, E. (2010). Versuch über den Menschen: Einführung in eine Philosophie der Kultur (Vol. 23). Hamburg, Germany: Meiner Verlag. (Original work published 1923)

6. Within the social system, such a duality between (internal) structural coupling (Luhmann, 1993, chap. 10) and operational coupling was also specified (Luhmann, 1995, p. 778) among the subsystems when functionally differentiated in terms of the symbolic generalization of codes of communication. Above, following Luhmann's own example, we used the coupling between the economic system and the health system by means of a physician's note.


Deacon, T.W. (1997). The symbolic species: The co-evolution of language and the brain. New York, NY: Norton.

Deutsch, K.W. (1951). Mechanism, organism, and society: Some models in natural and social science. Philosophy of Science, 18(3), 230–252.

Dubois, D.M. (1998). Computing anticipatory systems with incursion and hyperincursion. In D.M. Dubois (Ed.), Computing anticipatory systems: CASYS. Proceedings of the American Institute of Physics First International Conference (Vol. 437, pp. 3–29). Woodbury, NY: American Institute of Physics.

Dubois, D.M. (2003). Mathematical foundations of discrete and functional systems with strong and weak anticipations. In M.V. Butz, O. Sigaud, & P. Gérard (Eds.), Lecture Notes in Artificial Intelligence: Vol. 2684. Anticipatory behavior in adaptive learning systems (pp. 110–132). Berlin, Germany: Springer-Verlag.

Durrett, R. (n.d.). Mathematics behind numb3rs. Season 4, Episode 12: Power. Available at http://www.math.cornell.edu/~numb3rs/lipa/power.html

Etzkowitz, H., & Leydesdorff, L. (2000). The dynamics of innovation: From national systems and "Mode 2" to a triple helix of university–industry–government relations. Research Policy, 29(2), 109–123.

Garfinkel, H. (2008). Toward a sociological theory of information (edited and introduced by A.W. Rawls). Boulder, CO: Paradigm. (Original work published 1952)

Garner, W.R., & McGill, W.J. (1956). The relation between information and variance analyses. Psychometrika, 21(3), 219–228.

Geertz, C. (1973). The interpretation of cultures. New York, NY: Basic Books.

Giddens, A. (1976). New rules of sociological method. London, England: Hutchinson.

Giddens, A. (1979). Central problems in social theory. London, England: Macmillan.

Giddens, A. (1984). The constitution of society. Cambridge, England: Polity Press.

Hajer, M.A., Hoppe, R., Jennings, B., Fischer, F., & Forester, J. (1993). The argumentative turn in policy analysis and planning. Durham, NC: Duke University Press.

Hayles, N.K. (1990). Chaos bound: Orderly disorder in contemporary literature and science. Ithaca, NY: Cornell University Press.

Hodgson, G., & Knudsen, T. (2011). Darwin's conjecture: The search for general principles of social and economic evolution. Chicago, IL: University of Chicago Press.

Horn, J., & Wilburn, D. (2005). The embodiment of learning. Educational Philosophy and Theory, 37(5), 745–760.

Husserl, E. (1962). Die Krisis der Europäischen Wissenschaften und die Transzendentale Phänomenologie [The Crisis of European Sciences and Transcendental Phenomenology]. The Hague, The Netherlands: Martinus Nijhoff. (Original work published 1935/1936)

Husserl, E. (1973). Cartesianische Meditationen und Pariser Vorträge [Cartesian Meditations and the Paris Lectures; D. Cairns, Trans.]. The Hague, The Netherlands: Martinus Nijhoff. Available at http://ia600601.us.archive.org/10/items/CartesiamMeditations/12813080-husserl-cartesian-meditations.pdf (Original work published 1929)

Jakulin, A. (2005). Machine learning based on attribute interactions. Ljubljana, Slovenia: University of Ljubljana. Available at http://stat.columbia.edu/~jakulin/Int/jakulin05phd.pdf

Krippendorff, K. (1980). Q; an interpretation of the information theoretical Q-measures. In R. Trappl, G.J. Klir, & F. Pichler (Eds.), Progress in cybernetics and systems research (Vol. VIII, pp. 63–67). New York, NY: Hemisphere.

Krippendorff, K. (1986). Information theory: Structural models for qualitative data. Beverly Hills, CA: Sage.

Krippendorff, K. (2009a). W. Ross Ashby's information theory: A bit of history, some solutions to problems, and what we face today. International Journal of General Systems, 38(2), 189–212.

Krippendorff, K. (2009b). Information of interactions in complex systems. International Journal of General Systems, 38(6), 669–680.

Kwon, K.S., Park, H.W., So, M., & Leydesdorff, L. (2012). Has globalization strengthened South Korea's National Research System? National and international dynamics of the triple helix of scientific co-authorship relationships in South Korea. Scientometrics, 90(1), 163–175. doi:10.1007/s11192-011-0512-9

Lengyel, B., & Leydesdorff, L. (2011). Regional innovation systems in Hungary: The failing synergy at the national level. Regional Studies, 45(5), 677–693.

Levidow, L. (2009). Democratizing agri-biotechnology? European public participation in AgBiotech assessment. Comparative Sociology, 8(4), 541–564.

Leydesdorff, L. (1993). "Structure"/"Action" contingencies and the model of parallel processing. Journal for the Theory of Social Behaviour, 23(1), 47–77.

Leydesdorff, L. (1994). Epilogue. In L. Leydesdorff & P.v.d. Besselaar (Eds.), Evolutionary economics and chaos theory: New directions for technology studies (pp. 180–192). London, England: Pinter.

Leydesdorff, L. (1995). The challenge of scientometrics: The development, measurement, and self-organization of scientific communications. Leiden, The Netherlands: DSWO Press, Leiden University.

Leydesdorff, L. (1997). Sustainable technological developments and second-order cybernetics. Technology Analysis & Strategic Management, 9(3), 329–341.

Leydesdorff, L. (2009). Interaction information: Linear and nonlinear interpretations. International Journal of General Systems, 38(6), 681–685.

Leydesdorff, L. (2010a). Redundancy in systems which entertain a model of themselves: Interaction information and the self-organization of anticipation. Entropy, 12(1), 63–79. doi:10.3390/e12010063

Leydesdorff, L. (2010b). The communication of meaning and the structuration of expectations: Giddens' "structuration theory" and Luhmann's "self-organization." Journal of the American Society for Information Science and Technology, 61(10), 2138–2150.

Leydesdorff, L. (2011). "Structuration" by intellectual organization: The configuration of knowledge in relations among scientific texts. Scientometrics, 88(2), 499–520.

Leydesdorff, L. (2012a). The triple helix, quadruple helix, . . . , and an n-tuple of helices: Explanatory models for analyzing the knowledge-based economy? Journal of the Knowledge Economy, 3(1), 25–35. doi:10.1007/s13132-011-0049-4

Leydesdorff, L. (2012b). Radical constructivism and radical constructedness: Luhmann's sociology of semantics, organizations, and self-organization. Constructivist Foundations, 8(1), 85–92.

Leydesdorff, L. (in press). Advances in science visualization: Social networks, semantic maps, and discursive knowledge. In B. Cronin & C. Sugimoto (Eds.), Next generation metrics: Harnessing multidimensional indicators of scholarly performance. Cambridge, MA: MIT Press.

Leydesdorff, L., & Etzkowitz, H. (2003). Can "the public" be considered as a fourth helix in university–industry–government relations? Report of the Fourth Triple Helix Conference. Science & Public Policy, 30(1), 55–61.

Leydesdorff, L., & Strand, Ø. (2013). The Swedish system of innovation: Regional synergies in a knowledge-based economy. Journal of the American Society for Information Science and Technology, 64(9), 1890–1902.

Leydesdorff, L., & Sun, Y. (2009). National and international dimensions of the triple helix in Japan: University–industry–government versus international co-authorship relations. Journal of the American Society for Information Science and Technology, 60(4), 778–788.

Leydesdorff, L., & Zawdie, G. (2010). The triple helix perspective of innovation systems. Technology Analysis & Strategic Management, 22(7), 789–804.

Luhmann, N. (1975). Interaktion, Organisation, Gesellschaft: Anwendungen der Systemtheorie. In M. Gerhardt (Ed.), Die Zukunft der Philosophie (pp. 85–107). Munich, Germany: List.

Luhmann, N. (1984). Soziale Systeme: Grundriß einer allgemeinen Theorie. Frankfurt, Germany: Suhrkamp.

Luhmann, N. (1986a). The autopoiesis of social systems. In F. Geyer & J.v.d. Zouwen (Eds.), Sociocybernetic paradoxes (pp. 172–192). London, England: Sage.


Luhmann, N. (1986b). Love as passion: The codification of intimacy. Stanford, CA: Stanford University Press.

Luhmann, N. (1986c). Intersubjektivität oder Kommunikation: Unterschiedliche Ausgangspunkte soziologischer Theoriebildung. Archivio di Filosofia, 54(1–3), 41–60.

Luhmann, N. (1990). Meaning as sociology's basic concept. In N. Luhmann (Ed.), Essays on self-reference (pp. 21–79). New York, NY: Columbia University Press.

Luhmann, N. (1991). Die Form "Person." Soziale Welt, 42(2), 166–175.

Luhmann, N. (1993). Das Recht der Gesellschaft. Frankfurt, Germany: Suhrkamp.

Luhmann, N. (1995). Social systems. Stanford, CA: Stanford University Press.

Luhmann, N. (1997). Die Gesellschaft der Gesellschaft. Frankfurt, Germany: Suhrkamp.

Luhmann, N. (2002). How can the mind participate in communication? In W. Rasch (Ed.), Theories of distinction: Redescribing the descriptions of modernity (pp. 169–184). Stanford, CA: Stanford University Press.

MacKay, D.M. (1969). Information, mechanism and meaning. Cambridge, MA: MIT Press.

Maturana, H.R. (1978). Biology of language: The epistemology of reality. In G.A. Miller & E. Lenneberg (Eds.), Psychology and biology of language and thought: Essays in honor of Eric Lenneberg (pp. 27–63). New York, NY: Academic Press.

Maturana, H.R., & Varela, F. (1980). Autopoiesis and cognition: The realization of the living. Boston, MA: Reidel.

McGill, W.J. (1954). Multivariate information transmission. Psychometrika, 19(2), 97–116.

Mead, G.H. (1934). The point of view of social behaviorism. In C.H. Morris (Ed.), Mind, self, and society from the standpoint of a social behaviorist. Works of G.H. Mead (Vol. 1, pp. 1–41). Chicago, IL: University of Chicago Press.

Merton, R.K. (1957). Social theory and social structure (Rev. ed.). Glencoe, IL: Free Press.

Miller, G.A. (1951). Language and communication. New York, NY: McGraw-Hill.

Mitroff, I.I. (1974). The subjective side of science. Amsterdam, The Netherlands: Elsevier.

Parsons, T. (1951). The social system. New York, NY: Free Press.

Parsons, T. (1963a). On the concept of political power. Proceedings of the American Philosophical Society, 107(3), 232–262.

Parsons, T. (1963b). On the concept of influence. Public Opinion Quarterly, 27(Spring), 37–62.

Parsons, T. (1968). Interaction: I. Social interaction. In D.L. Sills (Ed.), The international encyclopedia of the social sciences (Vol. 7, pp. 429–441). New York, NY: McGraw-Hill.

Parsons, T., & Shils, E.A. (1951). Toward a general theory of action. New York, NY: Harper & Row.

Rosen, R. (1985). Anticipatory systems: Philosophical, mathematical and methodological foundations. Oxford, England: Pergamon Press.

Rousseau, R. (1998). Venn, Caroll, Karnaugh en Edwards. Wiskunde & Onderwijs, 24(95), 233–241.

Ruesch, J., & Bateson, G. (1951). Communication: The social matrix of psychiatry. New York, NY: Norton.

Shannon, C.E. (1948). A mathematical theory of communication. Bell System Technical Journal, 27, 379–423, 623–656.

Shannon, C.E., & Weaver, W. (1949). The mathematical theory of communication. Urbana: University of Illinois Press.

Simon, H.A. (1962). The architecture of complexity. Proceedings of the American Philosophical Society, 106(6), 467–482.

Simon, H.A. (1973). The organization of complex systems. In H.H. Pattee (Ed.), Hierarchy theory: The challenge of complex systems (pp. 1–27). New York, NY: Braziller.

Strand, Ø., & Leydesdorff, L. (2013). Where is synergy in the Norwegian innovation system indicated? Triple helix relations among technology, organization, and geography. Technological Forecasting and Social Change, 80(3), 471–484.

Sun, Y., & Negishi, M. (2010). Measuring the relationships among university, industry and other sectors in Japan's national innovation system: A comparison of new approaches with mutual information indicators. Scientometrics, 82(3), 677–685.

Theil, H. (1972). Statistical decomposition analysis. Amsterdam, The Netherlands: North-Holland.

Ulanowicz, R.E. (1986). Growth and development: Ecosystems phenomenology. San José, CA: toExcel Press.

Ulanowicz, R.E. (2009). A third window: Natural life beyond Newton and Darwin. West Conshohocken, PA: Templeton Press.

Varela, F.J. (1979). Principles of biological autonomy. Amsterdam, The Netherlands: North-Holland.

von Neumann, J., & Morgenstern, O. (1944). Theory of games and economic behavior. Princeton, NJ: Princeton University Press.

Wagner, P., & Wittrock, B. (1990). States, institutions, and discourses: A comparative perspective on the structuration of the social sciences. In P. Wagner, B. Wittrock, & R.D. Whitley (Eds.), Discourses on society. Sociology of the sciences yearbook (Vol. 15, pp. 331–357). Dordrecht, The Netherlands: Springer.

Weaver, W. (1949). Some recent contributions to the mathematical theory of communication. In C.E. Shannon & W. Weaver (Eds.), The mathematical theory of communication (pp. 1–28). Urbana: University of Illinois Press.

Weinstein, F., & Platt, G.M. (1969). The wish to be free: Society, psyche, and value change. Berkeley: University of California Press.

Wicken, J.S. (1989). Evolution and thermodynamics: The new paradigm. Systems Research, 6(3), 181–186.

Willett, K., & Zwick, M. (2004). A software architecture for reconstructability analysis. Kybernetes, 33(5/6), 997–1008.

Ye, F.Y., Yu, S.S., & Leydesdorff, L. (in press). The triple helix of university–industry–government relations at the country level, and its dynamic evolution under the pressures of globalization. Journal of the American Society for Information Science and Technology. Available at http://arxiv.org/abs/1209.1260

Yeung, R.W. (2008). Information theory and network coding. New York, NY: Springer.
