Algorithmic Information Theory and the Emergence of Order: Entropy and replication. Sean Devine.
Algorithmic Information Theory and the Emergence of Order
Entropy and replication
Sean Devine
Victoria Management School
The Universe and Order

This talk makes two points:
1. Replication is a major ordering process, like crystallisation. E.g. where dn/dt ~ n^x, replicates will grow.
2. Algorithmic Entropy can be used to quantify order, including systems with noise and variation.
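The growth law dn/dt ~ n can be illustrated numerically. A minimal Euler-integration sketch (the rate constant k, step size and horizon are assumed example values, not from the talk): a population whose growth rate is proportional to its size grows exponentially, which is why replicates come to dominate a system.

```python
def simulate(n0, k, dt, steps):
    """Euler-integrate dn/dt = k * n and return the trajectory."""
    n = n0
    traj = [n]
    for _ in range(steps):
        n += k * n * dt  # growth proportional to current population
        traj.append(n)
    return traj

# Integrate to t = 1 with k = 1; the result approaches e ~ 2.72 as dt -> 0.
traj = simulate(n0=1.0, k=1.0, dt=0.01, steps=100)
print(traj[-1])
```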
Algorithmic entropy or algorithmic complexity

Algorithmic Entropy = length of the shortest algorithm that generates the string defining a structure or configuration.
Using a simple binary UTM, denoted by U:
H_U(s) = minimum |p| such that U(p) = s
H_algo(s) ≤ H_U(s) + O(1), for a self-delimiting algorithm
– the Kraft inequality holds
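The Kraft inequality for self-delimiting (prefix-free) codes says the sum of 2^(-|p|) over all codewords is at most 1. A small check on an invented prefix-free codeword set (the set itself is illustrative, not from the talk):

```python
def kraft_sum(codewords):
    """Sum of 2^(-len) over a set of codewords; <= 1 for a prefix-free code."""
    return sum(2.0 ** -len(w) for w in codewords)

# No codeword is a prefix of another, so the Kraft sum must not exceed 1.
prefix_free = ["0", "10", "110", "111"]
print(kraft_sum(prefix_free))  # 0.5 + 0.25 + 0.125 + 0.125 = 1.0
```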
Relationship to other entropies

For all strings in an equilibrium configuration, H_algo(s) = Shannon entropy (ignoring overheads).
The algorithmic entropy of a string captures the uncertainty:
k_B ln2 H_algo(s) = Boltzmann-Gibbs entropy
Meaningful and consistent for off-equilibrium configurations.
Algorithmic Information Theory (AIT) and Entropy

AIT was developed by Kolmogorov, Levin and, independently, Chaitin.
These developments are not readily accessible to scientists.
Zurek was the first to seriously apply it to physics.
Ordered strings can be compressed

s = "111…..111", i.e. N 1's, can be generated by:
p = PRINT "1" N times
H_algo(s) ~ log2 N + log2 log2 N
– ignoring the PRINT statement for large N
– the second term is the cost of self-delimiting algorithms

Disordered or random strings are incompressible
s = "110111..1100…11" of length N
H_algo > length of string
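H_algo itself is uncomputable, but a practical compressor gives a computable upper bound that shows the same contrast: an ordered string compresses drastically, a random one barely at all. A sketch using Python's zlib (compressor choice and string length are assumptions for illustration):

```python
import random
import zlib

# Ordered string: N repeats of "1" -> a short description exists.
ordered = b"1" * 10000

# Disordered string: N random bytes -> essentially incompressible.
random.seed(0)
disordered = bytes(random.getrandbits(8) for _ in range(10000))

print(len(zlib.compress(ordered)))     # tens of bytes: highly compressible
print(len(zlib.compress(disordered)))  # about the original length
```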
Order = low algorithmic entropy

Order is rare: most strings are random.
We cannot determine whether s is compressible
– a consequence of Gödel and Turing
– but if we perceive order, the string is compressible
Common algorithmic instructions taken as given

Entropy is a state function: only differences have meaning.
Physical laws, machine dependence and phase-space graining can be absorbed into the common instructions:
p = xxxxxx : yyyyyy…
i.e. string p* + physical laws etc.
H_algo(s) can be taken to be |p*|
Provisional Entropy

Provisional entropy makes the measure meaningful for noisy descriptions.
H_algo(Set) specifies the set of all possible noisy strings consistent with a pattern or a model.
Given the set, H_algo(string in set) specifies a particular string in the set.
H_prov = H_algo(Set) + H_algo(string in set)
Provisional, because a hidden pattern might exist.
Cf. the Algorithmic Minimum Sufficient Statistic of Kolmogorov.
Shannon and Provisional Entropy

Shannon entropy: H_s = log2 N (specifies which string); no information on the context.
Provisional entropy: an algorithm to define the context (i.e. model or pattern) + log2 N (specifies which string).
Algorithmic entropy and physical laws

A real-world computation defines the trajectory of a system:
– a chemical reaction
– DNA replication
These function like a UTM.
H_algo ≤ |system's internal algorithm|
Discarded information makes the process irreversible
– entropy cost k_B log_e 2 per bit discarded; i.e. k_B T log_e 2 Joules
– Landauer, Bennett
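The Landauer cost quoted above is easy to evaluate numerically. A sketch using the standard value of Boltzmann's constant (the 300 K room temperature is an assumed example):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)

def landauer_cost(bits, temperature_kelvin):
    """Minimum heat dissipated by discarding `bits` bits: k_B * T * ln 2 each."""
    return bits * K_B * temperature_kelvin * math.log(2)

# Cost of discarding one bit at room temperature: a few zeptojoules.
print(landauer_cost(1, 300.0))
```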
Algorithmic Entropy

Defines the entropy of an actual configuration.
Applies to non-equilibrium situations.
Provides the thermodynamic cost of discarding entropy
– e.g. the cost of recycling
– the cost of non-equilibrium existence
A set of replicates has low algorithmic entropy

p = "Repeat replicate N times"
Example: a two-state atomic laser
i = 11111…111, representing excited atomic states
– no photon states
– ignore momentum states as constant
H_algo(i) low = ordered

[Figure: grid of atomic states, mostly 1s (excited) with occasional 0s, and photon states x, y]
Free expansion trajectory

f = 1100110.. atomic states + 11xy1 photon states (1 = coherent, x = incoherent)
At equilibrium, photons are absorbed and emitted.

[Figure: grid of mixed atomic and photon states]
The computation

H_algo(f) is more disordered:
– atomic states have a longer description
– coherent photon states have a short description: "Repeat photon N times"
– incoherent photon states are random: xyzxx…
Like a free expansion, but replication compensates for the disordering.
This would seem to be an underlying principle.
Generalisation, e.g. N spins

i = xxxxx… (spins) + 00000…000 (= low-T sink). H_prov(i) ~ N + |N|
f = 111111… (spins) + xxxxx..xxx (sink; T rises); xxxxx..xxx is discarded as latent heat.
H_prov(f) ~ |N|, as the disorder of the sink states is no longer in the description.
Irreversibility = ejecting disorder
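The book-keeping for the spin example can be sketched numerically, writing |N| as the ~log2 N bits needed to encode N (the value N = 1024 and the encoding helper are assumptions for illustration):

```python
import math

def bits_to_encode(N):
    """~|N|: bits needed to write down the integer N."""
    return math.ceil(math.log2(N))

N = 1024
# Initial state: N random spin bits plus the encoding of N itself.
h_initial = N + bits_to_encode(N)
# Final state: ordered spins; the sink's disorder has been ejected as latent heat.
h_final = bits_to_encode(N)

print(h_initial - h_final)  # N bits of disorder ejected from the description
```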
Provisional entropy measures variation in replicates

s = "1111…..1111": H_prov ~ log2 (N/2) + |11| (i.e. = log2 N)
s = "1y1y1y…..1y1y", where 1y is a variation of 11: H_prov ~ N/2 + log2 (N/2) + |0| + |1|
– 2^(N/2) members in the set
Entropy change = N/2 = increase in uncertainty
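These figures can be reproduced directly. A sketch assuming N = 1024 and dropping the constant overheads |11|, |0|, |1|: the set of noisy replicates has 2^(N/2) members, so picking one member costs N/2 bits on top of describing the set.

```python
import math

def h_prov_ordered(N):
    """"11" repeated N/2 times: ~log2(N/2) + const, i.e. ~log2 N bits."""
    return math.log2(N)

def h_prov_noisy(N):
    """Set description ~log2(N/2) bits + N/2 bits to pick one of 2^(N/2) members."""
    return N / 2 + math.log2(N / 2)

N = 1024
print(h_prov_noisy(N) - h_prov_ordered(N))  # ~N/2: the increase in uncertainty
```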
System State Space Trajectory

1. Initial growth of replicates.
2. At saturation, replicates die and are born.
3. When entropy is ejected, the system locks into an attractor-like region
– births = deaths of replicates
– if not isolated, homeostasis requires replicate regeneration

[Figure: first replication; entropy passed to environment; attractor for the dynamical system where replicates are highly probable]
Attractor-like behaviour off equilibrium

Resource flows are needed to regenerate replicates, e.g. pumping the laser to replenish photons.
Variation in replicates stabilises the system.
External impacts restrict the size of the attractor-like region; its shape changes and it may merge with another replicate set.
Coupling of Replicators

Replicators that pass resources (entropy) to each other are:
– more likely, as they are more resource-efficient
– less costly to maintain off equilibrium
E.g. one laser system pumping another.
Nesting systems reduces algorithmic entropy

A nested system orders at different scales.
As described by nested algorithms, H_algo is low.
But if large-scale ordering is lost, the algorithmic entropy increases.
At the smallest scale no order is observed.
Cf. an algorithm that defines me with algorithms that see me as a pile of atoms.

[Figure: nested regions of diameters d0, d1, d2, d3]
d-diameter complexity

Reducing the scale suppresses order, i.e. gives a longer description.
Variation increases entropy (dotted line), but nesting decreases entropy to compensate.
D_org = H_max(x) - H_d0(x)
Software variation is algorithmically more efficient at low scale.
Universe evolution and the 2nd law

The universe is in an initial state; its trajectory is determined by an algorithm:
p = For step 0 to t; compute next state; next step.
If physical laws are simple, |p| ~ log2 t
Equilibrium when log2 t' >> log2 t
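The claim |p| ~ log2 t follows because the loop program's length is dominated by the bits needed to write the bound t itself. A sketch (LAWS_BITS is an invented constant standing in for the fixed cost of encoding the physical laws):

```python
import math

LAWS_BITS = 100  # assumed constant: cost of encoding "compute next state"

def loop_program_length(t):
    """Length of "for step in 0..t: compute next state": constant + ~log2 t bits."""
    return LAWS_BITS + math.ceil(math.log2(t))

# Running 2^20 times longer adds only 20 bits to the program.
print(loop_program_length(2 ** 40) - loop_program_length(2 ** 20))  # 20
```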
What does it all mean

We have a practical entropy measure: a tool for measuring change.
It shows how replication counters entropy increase.
Nested structures are highly ordered; nesting counters the entropy increase from variations and minimises the entropy cost of adaptation.
Maybe replication maintains order while the universe's trajectory is towards disorder.
References

Kolmogorov, A. N. Three approaches to the quantitative definition of information. Prob. Info. Trans. 1965, 1, 1-7.
Zvonkin, A. K.; Levin, L. A. The complexity of finite objects and the development of the concepts of information and randomness by means of the theory of algorithms. Russ. Math. Survs. 1970, 25, 83-124.
Chaitin, G. On the length of programs for computing finite binary sequences. J. ACM 1966, 13, 547-569.
Zurek, W. H. Algorithmic randomness and physical entropy. Physical Review A 1989, 40, 4731-4751.
Bennett, C. H. The thermodynamics of computation: a review. International Journal of Theoretical Physics 1982, 21, 905-940.
Landauer, R. Irreversibility and heat generation in the computing process. IBM Journal of Research and Development 1961, 5, 183-191.