Off-Loading Memory to the Environment: A Quantitative Example

JOHN CASE
Department of Computer and Information Sciences, University of Delaware, Newark, DE 19716, USA; E-mail: [email protected]

Abstract. R.W. Ashby maintained that people and animals do not have to remember as much as one might think since considerable information is stored in the environment. Presented herein is an everyday, quantitative example featuring calculation of the number of bits of memory that can be off-loaded to the environment. The example involves one’s storing directions to a friend’s house. It is also argued that the example works with or without acceptance of the extended mind hypothesis. Additionally, a brief supporting argument for at least a form of this hypothesis is presented.

Key words: bits, brain, environment, extended mind hypothesis, mathematics

1. Background

R.W. Ashby (who wrote Ashby, 1956, 1960) pointed out many times in a graduate Cybernetics course I took from him in the late 1960s that people (and other animals) do not have to remember as much as one might think since considerable information is stored in the environment. There is related discussion about the memory load on ants compared to that on their environment in Simon (1981). There is further somewhat related discussion in Hutchins (1995) about cognition more generally. In particular, Hutchins (1995) focuses on cognition, including memory, for navigation on a naval ship, and maintains it is distributed among the human personnel, navigation equipment, and the environment external to the ship.

2. The Quantitative Example

Here is, then, an everyday, quantitative example featuring calculation of the number of bits of memory that can be off-loaded to the environment.

Typically (correct) directions, along a series of roads, to, say, a friend’s house will provide an initial direction together with a sequence with entries such as

and then you come to the Exxon station at which you should take a right.

Minds and Machines 14: 387–389, 2004. © 2004 Kluwer Academic Publishers. Printed in the Netherlands.

Suppose the initial direction is fixed. Suppose each landmark is unique (e.g. there is at most one Exxon station) and the path is loop-free. Then, for such sequences having n landmarks with corresponding binary choices (take a right or a left), one need remember only a mapping from landmarks to {right, left}. There are only 2^n such mappings, so the information needing to be stored is a mere n bits. However, by a straightforward counting argument, the set of all such sequential directions numbers n! × 2^n. Remembering any one of those, then, takes log2(n!) + n bits.[1] One needs to remember only n of these bits; hence, in effect, the environment stores, among other things, the residual log2(n!) = Θ(n log2(n)) bits (by formula (3.18), on p. 55 of Cormen et al. (2001)).[2]
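The arithmetic above can be checked with a short script. This is a sketch of the counting argument, not code from the paper; the function name `memory_bits` is my own.

```python
import math

def memory_bits(n):
    """Bits of memory under each strategy, for n unique landmarks.

    A sketch of the paper's counting argument: the savvy agent needs
    one right/left bit per landmark; a whole labeled sequence needs
    log2(n! * 2^n) = log2(n!) + n bits; the difference, log2(n!),
    is carried by the environment.
    """
    savvy = n
    full = math.log2(math.factorial(n)) + n
    off_loaded = full - savvy
    return savvy, full, off_loaded

for n in (5, 10, 20):
    savvy, full, env = memory_bits(n)
    print(f"n={n:2d}: savvy agent {savvy} bits, "
          f"full sequence {full:.1f} bits, off-loaded {env:.1f} bits")
```

Already at n = 20, the off-loaded portion (about 61 bits) exceeds the 20 bits the savvy agent must store.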

Therefore, in my example, a not-so-savvy agent may store a whole non-repeating sequence of n landmarks each labeled with right or left. A savvy agent stores only one of the mappings from landmarks to {right, left} – which requires only n bits. In effect, the savvy agent places the memory load of the Θ(n log2(n)) bits, that the not-so-savvy agent might have additionally stored, on the environment. These extra bits, which might be but need not be stored by the agent, are, in any case, stored in the environment. They are the bits for storing the sequence of the entries. Of course, the environment contains even more, for example, the whole spatial layout of the landmarks.[3]
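The two strategies can be made concrete in a small sketch. The landmark names and the `drive` helper below are hypothetical illustrations, not from the paper: the not-so-savvy agent stores the ordered, labeled sequence; the savvy agent stores only the unordered landmark-to-turn mapping and lets the environment supply the order.

```python
# Not-so-savvy: an ordered sequence of (landmark, turn) pairs.
full_directions = [("Exxon station", "right"),
                   ("red barn", "left"),
                   ("water tower", "right")]

# Savvy: only the unordered mapping from landmark to {right, left};
# the sequencing information stays out in the world.
turn_at = {landmark: turn for landmark, turn in full_directions}

def drive(landmarks_encountered, turn_at):
    """Follow the route using only the mapping; the environment
    presents the landmarks in route order as one drives."""
    return [(lm, turn_at[lm]) for lm in landmarks_encountered]

# The environment supplies the order; the agent supplies the turns.
assert drive(["Exxon station", "red barn", "water tower"],
             turn_at) == full_directions
```

The mapping `turn_at` is recoverable from `full_directions`, but not vice versa: the sequence order is exactly the information off-loaded to the environment.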

3. Relation to the Extended Mind Hypothesis

In this section I will first argue that my above quantitative example is independent of the extended mind hypothesis, which allows flexibility concerning the conceptual placement of the boundaries of mind/cognition (including memory). Then (as a bonus, so to speak) I will briefly argue for a form of the extended mind hypothesis anyhow, a form in which I talk about cognition (and not about mind).

In an example from Clark and Chalmers (1998), Inge remembers in and recalls from Inge’s brain where the museum is, but Otto, an Alzheimer’s patient, records in and recalls from his notebook where it is. For my quantitative example above, each of Inge and Otto with his notebook would need to store only n bits. Inge might do it in her brain and Otto with his notebook in Otto’s notebook. Inge or Otto with his notebook might be less savvy and store an additional Θ(n log2(n)) bits, Inge in Inge’s brain and Otto with his notebook in Otto’s notebook. My quantitative example works whether the bits are stored in a brain or a notebook.

Of course, if one does not allow one’s concept of cognition, including memory, to spill over into such devices as Otto’s notebook and forces it to be confined to the brain, then the team of Otto and his notebook is not doing cognitive memory. However, it seems to me human conceptualization involves mental cutting of nature at its (apparent) joints – for useful purposes, including for thought and communication. Imagine, if you will, a futuristic world containing an Otto with progressively more advanced Alzheimer’s disease and with his notebook replaced by successively more intelligent robots, culminating in a highly intelligent robot with legal rights and a job involving its carrying Otto around to visit friends he no longer remembers but may still enjoy. Conceptually, it becomes progressively more useful to exclude Otto mostly or entirely from the boundaries of the navigation cognition required and to ascribe most or all of that cognition to the robot. At least, in between, when


Otto is still cognitively contributing, it would seem useful to conceptualize Otto and the robot as a team (or part of a team) with the relevant cognition distributed between them.[4]

Acknowledgement

The author received support from NSF grant CCR-0208616.

Notes

[1] It is useful to reckon memory size needed in a way that remains valid in all contexts. This is typically done by calculating the number of bits (0’s and 1’s) required. For example, the standard character set called extended ASCII has exactly 256 characters. The number of bits minimally required to store any one character of these 256 characters is, then, log2(256), which = 8 bits, that is, one byte of memory. This number of bits crucially, minimally enables each possible extended ASCII character to have a different code in memory from each other such character. If we only ever wanted to store the single character b, then the number of bits required would be log2(1), which = zero bits. It is only when memory has to accommodate the possibility of storing any one of more than one possible thing that more than zero bits are required.

Regarding the non-repeating sequences of n landmarks with each landmark labeled with right or left, there are n! × 2^n possibilities. Here is why. The non-repeating sequences of n landmarks can be counted as follows. There are n choices for the first landmark, (n − 1) for the second, . . ., and 1 for the last. This yields a total of n! possible non-repeating landmark sequences. Each landmark in each such sequence is labeled with one of two possibilities: right or left. Hence, there are n! × 2^n total possible non-repeating sequences of n landmarks with each landmark labeled with right or left. Similarly to the above analysis for extended ASCII, we have, then, the number of bits required to store any one of the n! × 2^n possible sequential directions is log2(n! × 2^n), which, by standard logarithm identities, = log2(n!) + n bits.

[2] From p. 42 of Cormen et al. (2001), for functions f, g mapping, say, positive integers to non-negative reals, f(n) = Θ(g(n)) ⇔def (∃ positive constants c1, c2 and a positive integer n0) (∀n ≥ n0) [0 ≤ c1 g(n) ≤ f(n) ≤ c2 g(n)], that is, informally, f(n) = Θ(g(n)) ⇔def, modulo positive multiplicative constants, g(n) provides an asymptotically tight upper and lower bound on f(n). Hence, modulo positive multiplicative constants, n log2(n) provides an asymptotically tight upper and lower bound on log2(n!). Of course, n log2(n) has an easier-to-understand growth than does log2(n!).

[3] Note that environments, like other memory media, are malleable, and, then, may store different things at different times.

[4] The concept of mind typically includes some reflective component, but the concept of cognition need not. The quantitative arguments herein are independent of whether a reflective component is included, but the argument comparing Inge and Otto with his notebook or robot does not deal with a reflective component of mind or cognition.

References

Ashby, R. (1956), An Introduction to Cybernetics, New York: Wiley.
Ashby, R. (1960), Design for a Brain: The Origin of Adaptive Behavior, second edition, New York: Wiley.
Clark, A. and Chalmers, D. (1998), ‘The Extended Mind’, Analysis 58, pp. 10–23. Reprinted in P. Grim, ed. (1998), The Philosopher’s Annual, p. XXI.
Cormen, T., Leiserson, C., Rivest, R. and Stein, C. (2001), Introduction to Algorithms, second edition, Cambridge, MA: MIT Press.
Hutchins, E. (1995), Cognition in the Wild, Cambridge, MA: MIT Press.
Simon, H. (1981), The Sciences of the Artificial, second edition, Cambridge, MA: MIT Press.