Physics and Metaphysics - CTMU



  All material at this website © 1998-2005 by Christopher Michael Langan

Flash from Deep Space: Supernews on Supernovas     

At a January, 1998 meeting of the American Astronomical Society, it was announced that a survey of distant supernovae by an international team of astronomers indicates that cosmic expansion is being accelerated by an unknown repulsive force…a force that appears strong enough to prevent gravity from collapsing the universe. 

The supernovae are of an extensively studied variety known as type Ia.  Type Ia supernovae, which result from the explosion of certain white dwarf stars, are so similar and steady in luminosity that they can be used as standard candles, or reliable measures of astral brightness. Because they always shine with the same intrinsic intensity, it is possible to infer their distances by the dimness of their light.  Distance can then be plotted against speed of recession as calculated from frequency reductions due to cosmic redshift.  Such a plot turns out to be increasingly nonlinear as distance increases, suggesting an unexpected acceleration. 
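
To make the standard-candle inference concrete, here is a minimal Python sketch (not part of the original article; the peak absolute magnitude M ≈ −19.3 used below is an assumed, approximate value for type Ia supernovae).  It simply inverts the distance-modulus relation m − M = 5 log10(d / 10 pc): the dimmer the observed peak (the larger m), the greater the inferred distance.

```python
# Illustrative sketch (not from the article): how a "standard candle" yields distance.
# Assumes type Ia supernovae share a peak absolute magnitude of roughly M = -19.3.
M_ABS = -19.3  # assumed peak absolute magnitude of a type Ia supernova

def luminosity_distance_parsecs(apparent_magnitude: float) -> float:
    """Invert the distance modulus m - M = 5*log10(d / 10 pc)."""
    return 10.0 ** ((apparent_magnitude - M_ABS + 5.0) / 5.0)

# The dimmer the observed supernova (larger m), the farther away it is.
for m in (14.0, 19.0, 24.0):
    d_mpc = luminosity_distance_parsecs(m) / 1.0e6
    print(f"m = {m:5.1f}  ->  distance ~ {d_mpc:10.1f} Mpc")
```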

Most of those analyzing the data agree on the implications.  Because the observed supernovae are dimmer than would be expected had the rate of cosmic expansion remained constant since their parent stars exploded, expansion must have accelerated since the supernovae were born, and some kind of accelerative force must pervade the universe.  As early as February, 1998, an independent group calling itself the High-z Supernova Search Team (z means redshift) confirmed this finding with a statistical confidence of 98.7-99.99%.  Since then, further observations have provided yet more confirmation. 

Among the candidates for a repulsive force are Einstein’s cosmological constant, representing a form of antigravity associated with General Relativity, and quantum mechanical vacuum energy, which produces outward pressure through the spontaneous creation and annihilation of virtual particle-antiparticle pairs.  Also receiving welcome air time are more bizarre ideas like X-matter and quintessence, strange forms of background energy that might be produced by mysterious physical processes as yet unidentified.

In the minds of most theorists, the best bet is a combination of cosmological constant and vacuum energy.  That would deliver the steady effect that has been observed, as opposed to the fluctuations that might accompany X-matter and quintessence.  Unfortunately, none of the experts can give a constructive reason for why any of these influences should themselves exist. No conventional theory incorporating the Standard Big Bang Model of cosmology can explain why an expansive force should be evident or how it might have originated.

However, an alternative theory does exist.  It is virtually unknown to most physicists, whose stock repertoire is limited to theories that have been published by other physicists in exclusive scholastic journals unreceptive to strange new ideas emanating from unknown, academically uncredentialed sources.  It has, however, been published for a highly intelligent readership in a journal called Noesis.  This theory is called the CTMU.  Although Noesis is not, strictly speaking, a peer-reviewed journal, the CTMU has been extensively criticized and defended within it over the last decade on the basis of published descriptions.

The CTMU is not an ordinary physical theory.  Instead of being just another system of ad hoc equations purporting to describe some limited aspect of the physical universe, it has a unique logical structure designed to embed the very foundations of mathematics and philosophy.  Yet, unlike many predominantly rationalistic theories, it is designed to resolve paradox on both rational and observational levels, and thus makes verifiable statements regarding the nature of observable reality.  Some of these statements, which range in application from quantum physics to cosmology, relate to cosmic expansion.

To query the universe regarding its true nature is to ask a very deep question.  There was always a strange and touching mixture of humility and hubris in the idea that physicists could obtain an answer to this question simply by looking through their instruments and carefully describing what they saw.  After all, reality contains features like mind, cognition and consciousness that do not lend themselves to empirical techniques or scientific reductionism. Yet, they are basic, unavoidable ingredients of every scientific measurement ever performed and every theoretical explanation ever devised.  Without conscious minds subjecting reality to cognition, science could not exist.

The CTMU provides physics and cosmology with a logical framework that incorporates these missing ingredients in a way reflecting the nature of their involvement in measurement and theorization.  And as a bonus, it does what no other theory can: while painlessly absorbing the bulk of modern physical science and resolving many of its most stubborn and destructive paradoxes, it coherently explains the cosmological implications of the evidence for accelerative recession of type Ia supernovae in the context of a self-contained, self-creating universe.

 

Physics and Metaphysics  (© 1998 - 2002 by C.M. Langan)

 

Today : Metaphysics :: Tomorrow : Physics

Today’s dominant theory of small-scale physics, quantum mechanics, did not begin its long and successful run as a physical theory.  The reason is logical; its major premise, the Heisenberg Uncertainty Principle, sets absolute limits on the accuracy to which quanta can be measured, and cursory logical analysis reveals that this defines a relation between measurer and measured object that cannot be expressed in a language describing measured objects alone.  Since classical physics was the latter kind of language, neither the uncertainty principle nor quantum mechanics could be immediately classified as “physics”.  Rather, they belonged to a logical metalanguage of physics called metaphysics.  Indeed, even at a time when physics routinely explains what the ancients would have seen as “magic”, some physicists view quantum mechanics with a touch of uneasy skepticism.  The reason: it raises too many metaphysical-looking issues without satisfactorily resolving them.

Relativity too was initially a metaphysical theory based on the formation of higher-order predicates, spacetime and spacetime curvature, that had not existed in physics, drawing in the interests of self-containment a higher-order causal relationship between the fundamental physical parameters space, time and matter on a combination of empirical and mathematical grounds (a higher-order relation is a relation of relations…of relations of primitive objects, defined at the appropriate level of predicate logic).  Since this describes a semantic operation that cannot be effected within the bare language of physics as it existed at the time, relativity was metaphysical rather than physical in nature.  Nevertheless, it achieved quick recognition as a physical theory…not only because Einstein was already recognized as a professional physicist, but because it made physical predictions that classical physics alone did not make.

It was recognized long before Einstein that observations of physical objects vary subjectively in certain parameters.  For example, although objects are identical when viewed under identical conditions, they vary in size when viewed from different distances, display different shapes from different angles, and seem to be differently colored and shaded under different lighting conditions.  Einstein expanded the range of subjective variation of physical phenomena by showing that objects also look different when viewed at different relative velocities.  But in keeping with classical objectivism, he showed in the process that such perceptual variations were a matter of objective circumstance, in effect treating perception itself as an objective phenomenon.  Because this kind of empirical objectivization is exactly what is expected of the objective empirical science of physics, attention was diverted from nagging metaphysical questions involving the interplay of rational and empirical factors in perception. 

Although he never got around to enunciating it, Einstein may well have sensed that perception cannot be understood without understanding the logic of this interplay, and that this logic is instrumental to the inherently metaphysical operation of theoretical unification.  Nevertheless, perhaps encouraged by his own apparent success in sidestepping this particular metaphysical issue, he spent the second half of his career on a doomed attempt to unify physics in a purely physical context - that is, in the context of a spacetime model which went only as far as Relativity Theory.  Since then, many bright and well-educated people have repeated roughly the same error, never realizing that physical theory truly advances only by absorbing profoundly creative metaphysical extensions on which no ordinary physicist would wittingly sign off.

Like quantum mechanics and the Theory of Relativity, the CTMU is a metaphysical theory that makes distinctive predictions and retrodictions not made by previous theories.  However, the CTMU makes no attempt in the process to sidestep difficult metaphysical issues.  For example, Einstein introduced the cosmological constant to stabilize the size of the universe, but then dropped it on the empirical grounds of apparent universal expansion.  In contrast, the CTMU goes beyond empiricism to the rational machinery of perception itself, providing cosmic expansion with a logical basis while predicting and explaining some of its features.  Relativity pursues the goal of explanatory self-containment up to a point; spacetime contains matter and energy that cause spacetime fluctuations that cause changes in matter and energy.  The CTMU, on the other hand, pursues the goal of self-containment all the way up to cosmogenesis.  And while neither GR nor QM does anything to resolve the fatal paradoxes of ex nihilo creation and quantum nonlocality, the CTMU dissolves such paradoxes with a degree of logical rectitude to which science seldom aspires. 

To understand how the CTMU is a natural extension of modern physics, let us review the history of the physicist’s and cosmologist’s art in the context of Cartesian coordinate systems. 

 

The Cartesian Architecture of Split-Level Reality

Modern physics is based on, and can be described as the evolution of, rectilinear coordinate systems.  Called Cartesian coordinate systems after their leading exponent, René Descartes, they are particularly well-suited to the objective modeling of physical phenomena, that is, to the algebraic representation of static and dynamic relationships without respect to their locations or the special vantages from which they are observed or considered.  This property of Cartesian spaces relates directly to another invention of Monsieur Descartes called mind-body dualism, which merely states in plain language what Cartesian coordinate systems seem to graphically depict: namely, that cognition and physical reality can be factored apart at our philosophical and scientific convenience.

Since the time of Descartes, there have been numerous attempts to clarify the exact relationship between mind and reality and thereby solve the "mind-body problem".  Hume, for example, held that reality consists exclusively of sense impressions from which concepts like “mind” and “matter” are artificially abstracted.  Kant countered with the view that the mind knows deep reality only through cryptic impressions molded to its own cognitive categories. More recently, Lewes’ dual-aspect monism maintained that reality consists of a neutral substance of which mind and body are just complementary aspects, while the mind-stuff theory of Clifford and Prince was a form of psychical monism positing that mind is the ultimate reality, that ordinary material reality is simply mind apprehended by mind, and that the higher functions of mind emerge from smaller pieces of mind that do not of themselves possess higher mental attributes (an idea previously attributable to Leibniz, Spencer and others).

But while each of these theories contains a part of the truth, none contains the whole truth. Only recently have the parts been assembled in light of modern logic and science to create a dynamic, comprehensive theory of reality that solves the mind-body problem for good. 

 

Classical Mechanics: The Infinitesimal Worm in Newton’s Apple

The next big representational advance in physics, Newtonian mechanics, relied on a new kind of analytic geometry based on vector modeling and vector analysis of physical motion in Cartesian coordinate systems in which space and time are represented as independent (orthogonal) axes. Its Lagrangian and Hamiltonian formulations occupy the same mathematical setting.  From the beginning, the featureless neutrality of Cartesian space accorded well with the evident nonpreference of physical laws for special coordinate systems…i.e., for the homogeneity and isotropy of space and time.  However, this property of Cartesian spaces also rendered position and motion completely relative in the Galilean sense.  Finding that this made it hard to explain certain physical phenomena - e.g., inertia, and later on, electromagnetic wave propagation - Newton and his followers embraced the concept of a stationary aether against whose background all dimensions, durations and velocities were considered to be fixed.  The aether was an unusual kind of “field” over which the center of mass of the universe was perfectly distributed, explaining not only inertia but the evident fact that all of the matter in the universe was not gravitating toward a central point in space.

In order to work with his vector model of physical reality, Newton had to invent a new kind of mathematical analysis known as infinitesimal calculus.  Specifically, he was forced in calculating the velocity of an object at each point of its trajectory to define its “instantaneous rate of change” as a ratio of "infinitesimal" vectors whose lengths were “smaller than any finite quantity, but greater than zero”.  Since this is a paradoxical description that does not meaningfully translate into an actual value – the only number usually considered to be smaller than any finite quantity is 0 - it is hard to regard infinitesimal vectors as meaningful objects.  But even though they could not give a satisfactory account of infinitesimal vectors, post-Newtonian physicists at least knew where such vectors were located: in the same Cartesian space containing ordinary finite vectors.  Better yet, the celebrated mathematician Gauss discovered that they could be confined within the curves to which they were tangential, an insight developed by Riemann into a comprehensive theory of differential geometry valid for curved surfaces in any number of dimensions.  However, although differential geometry would later prove useful in formulating a generalization of Cartesian space, its “intrinsic” nature did not resolve the paradox of infinitesimals.
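
To see the paradox numerically, consider the following minimal Python sketch (not from the original article; the function x(t) = t**2 is an arbitrary example).  The finite ratio approaches a definite limit as the interval shrinks, yet for every nonzero interval it still differs from that limit - exactly the gap Newton's "infinitesimal" vectors were invented to close.

```python
# Illustrative sketch: the "instantaneous rate of change" as a limit of finite ratios.
# For x(t) = t**2 the ratio at t = 1 approaches 2 as dt shrinks, but never equals it
# for any finite dt.
def x(t):
    return t * t

t0 = 1.0
for dt in (1e-1, 1e-3, 1e-6, 1e-9):
    ratio = (x(t0 + dt) - x(t0)) / dt
    print(f"dt = {dt:.0e}   (x(t0 + dt) - x(t0)) / dt = {ratio:.12f}")
```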

For many years after Newton, mathematicians struggled to find a way around the infinitesimals paradox, first lighting on the Weierstrass epsilon-delta formalism purporting to characterize infinitesimals within standard Cartesian space.  But later – in fact, years after Newtonian mechanics was superseded by a more general theory of physics – they finally satisfied their yearning to understand infinitesimals as timeless mathematical objects.  They accomplished this by reposing them in a hypothetical nonstandard universe where each point of the standard universe is surrounded by a nonstandard neighborhood containing infinitesimal objects (in logic, a theory is justified by demonstrating the existence of a model for which it is true; as a model of infinitesimals and their relationships with finite numbers, the nonstandard universe justifies the infinitesimal calculus).  Ignoring the obvious fact that this could have a metaphysical bearing on physical reality, mathematicians took the nonstandard universe and carried it off in the purely abstract direction of nonstandard analysis.
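
The flavor of "reposing" infinitesimals in an enlarged number system can be illustrated with a small Python sketch (not from the original article).  Dual numbers of the form a + bε with ε² = 0 are not nonstandard analysis proper - the hyperreals are a far richer structure - but they do show how an infinitesimal can be handled as a bona fide algebraic object: the derivative of a polynomial at x appears exactly as the ε-coefficient of f(x + ε).

```python
# Illustrative sketch: an "infinitesimal" eps as a first-class object, with eps**2 = 0.
class Dual:
    def __init__(self, real, eps=0.0):
        self.real, self.eps = real, eps
    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.real + other.real, self.eps + other.eps)
    __radd__ = __add__
    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # (a + b*eps)(c + d*eps) = ac + (ad + bc)*eps, since eps**2 = 0
        return Dual(self.real * other.real,
                    self.real * other.eps + self.eps * other.real)
    __rmul__ = __mul__

def f(x):
    return 3 * x * x + 2 * x        # f'(x) = 6x + 2

result = f(Dual(5.0, 1.0))          # evaluate at x = 5 + eps
print(result.real, result.eps)      # 85.0 (value) and 32.0 (exact derivative at 5)
```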

 

Quantum Mechanics: Space and Time Get Smeared (and Worse)

After standing for over two centuries as the last word in physics, the differential equations comprising the deterministic laws of Newtonian mechanics began to run into problems.  One of these problems was called the Heisenberg Uncertainty Principle or HUP.  The HUP has the effect of “blurring” space and time on very small scales by making it impossible to simultaneously measure with accuracy certain pairs of attributes of a particle of matter or packet of energy. Because of this blurring, Newton’s differential equations are insufficient to describe small-scale interactions of matter and energy.  Therefore, in order to adapt the equations of classical mechanics to the nondeterministic, dualistic (wave-versus-particle) nature of matter and energy, the more or less ad hoc theory of quantum mechanics (QM) was hastily developed.  QM identifies matter quanta with “probability waves” existing in infinite-dimensional complex Hilbert space, a Cartesian space defined over the field of complex numbers a+bi (where a and b are real numbers and i = √−1) instead of the pure real numbers, and replaces Hamilton’s classical equations of motion with Schrödinger’s wave equation.  QM spelled the beginning of the end for Laplacian determinism, a philosophical outgrowth of Newtonianism which held that any temporal state of the universe could be fully predicted from a complete Cartesian description of any other.  Not only uncertainty but freedom had reentered the physical arena.
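
The content of the HUP can be checked numerically.  The following Python sketch (not from the original article; units with ħ = 1 and the grid and packet width are arbitrary choices) builds a Gaussian wave packet, computes its position spread Δx directly and its momentum spread Δp from the Fourier transform, and confirms that the product sits at the Heisenberg bound ħ/2, the minimum allowed for any state.

```python
# Illustrative sketch: delta_x * delta_p for a Gaussian wave packet, with hbar = 1.
import numpy as np

hbar = 1.0
N = 4096
x = np.linspace(-40.0, 40.0, N)
dx = x[1] - x[0]
sigma = 1.7                                    # arbitrary packet width

psi = np.exp(-x**2 / (4 * sigma**2))           # Gaussian wavefunction
psi /= np.sqrt(np.sum(np.abs(psi)**2) * dx)    # normalize so total probability is 1

prob_x = np.abs(psi)**2
delta_x = np.sqrt(np.sum(x**2 * prob_x) * dx)  # position spread (mean is 0 by symmetry)

p = 2 * np.pi * hbar * np.fft.fftfreq(N, d=dx) # momentum grid conjugate to x
dp = abs(p[1] - p[0])
prob_p = np.abs(np.fft.fft(psi))**2
prob_p /= np.sum(prob_p) * dp                  # normalize in momentum space
delta_p = np.sqrt(np.sum(p**2 * prob_p) * dp)  # momentum spread (mean is 0)

print(f"delta_x * delta_p = {delta_x * delta_p:.6f}   (bound hbar/2 = {hbar / 2})")
# ~0.5: a Gaussian packet sits exactly at the Heisenberg minimum.
```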

Unfortunately, the HUP was not the only quantum-mechanical problem for classical physics and its offshoots.  Even worse was a phenomenon called EPR (Einstein-Podolsky-Rosen) nonlocality, according to which the conservation of certain physical quantities for pairs of correlated particles seems to require that information be instantaneously transmitted between them regardless of their distance from each other.  The EPR paradox juxtaposes nonlocality to the conventional dynamical scenario in which anything transmitted between locations must move through a finite sequence of intervening positions in space and time.  So basic is this scenario to the classical worldview that EPR nonlocality seems to hang over it like a Damoclean sword, poised to sunder it like a melon.  Not only does no standard physical theory incorporating common notions of realism, induction and locality contain a resolution of this paradox - this much we know from a mathematical result called Bell's theorem - but it seems that the very foundations of physical science must give way before a resolution can even be attempted!
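
Bell's theorem itself can be stated in a few lines of arithmetic.  In the CHSH formulation, any local hidden-variable account bounds a certain combination S of correlations by |S| ≤ 2, while quantum mechanics predicts E(a, b) = −cos(a − b) for spin measurements on a singlet pair and so can reach 2√2.  The Python sketch below (not from the original article; the angle choices are the standard ones that maximize the violation) computes the quantum value.

```python
# Illustrative sketch: the CHSH form of Bell's theorem.
import math

def E(a, b):
    return -math.cos(a - b)          # quantum singlet correlation for analyzer angles a, b

# Standard angle choices (radians) that maximize the quantum violation
a, a_prime = 0.0, math.pi / 2
b, b_prime = math.pi / 4, 3 * math.pi / 4

S = E(a, b) - E(a, b_prime) + E(a_prime, b) + E(a_prime, b_prime)
print(abs(S))        # ~2.828 = 2*sqrt(2), exceeding the local hidden-variable bound of 2
```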

 

The Special Theory of Relativity: Space and Time Beget Spacetime

Another problem for Newton’s worldview was the invariance of c, the speed of light in vacuo. Emerging from Maxwell’s equations and direct experimentation (e.g., Michelson-Morley), c-invariance defies representation in an ordinary Cartesian vector space.  Einstein’s Special Theory of Relativity (SR), arising at about the same time as quantum mechanics, was designed to fill the breach.  To accomplish this, Einstein had to generalize Cartesian space in such a way that distances and durations vary with respect to distinct coordinate systems associated with various states of relative motion.  The particular generalization of Cartesian space in which these motion-dependent coordinate systems are related, and in which the velocity of light can be invariantly depicted, is called Minkowski spacetime.  In spacetime, space and time axes remain perpendicular but no longer represent independent dimensions.  Instead, spacelike and timelike domains are separated by null or lightlike geodesics (minimal trajectories) representing the "paths" traced by light, and the constant velocity of light is represented by the constant (usually 45°) orientation of the corresponding spacetime vectors.  The space and time axes of moving coordinate systems are skewed by the Lorentz transformation functions according to their relative angles of propagation (velocities) through timelike domains, resulting in relative distortions of space and time vectors between systems.
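
A minimal Python sketch (not from the original article; c = 1 and the sample event and velocities are arbitrary) shows the two facts the paragraph describes: coordinates are skewed from frame to frame by the Lorentz transformation, yet the Minkowski interval t² − x² is left untouched.

```python
# Illustrative sketch: the Lorentz boost and the invariance of the Minkowski interval (c = 1).
import math

def boost(t, x, v):
    """Transform event (t, x) into a frame moving at velocity v along x."""
    gamma = 1.0 / math.sqrt(1.0 - v * v)
    return gamma * (t - v * x), gamma * (x - v * t)

def interval(t, x):
    return t * t - x * x            # the Minkowski "distance", same in every frame

t, x = 3.0, 2.0
for v in (0.0, 0.5, 0.9):
    t2, x2 = boost(t, x, v)
    print(f"v = {v:3.1f}   event = ({t2:6.3f}, {x2:6.3f})   interval = {interval(t2, x2):.6f}")
# The coordinates change frame to frame, but t**2 - x**2 stays 5.0.
```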

 

General Relativity: Spacetime Develops Curvature

Flat or Euclidean spacetime suffices for the representation of all kinds of physical motion up to constant linear acceleration.  However, gravity – which Newton had regarded as a force and represented as an ordinary Cartesian vector – causes other kinds of motion as well, e.g. orbital motion.  So in order to generalize Minkowski spacetime to explain gravity, Einstein had to undertake a further generalization of Cartesian space accommodating non-flat or curved spacetime.  In this generalization of Cartesian space, spacetime curvature is represented by algebraically well-behaved generalizations of vectors called tensors, which are just mathematical functions that take ordinary spacetime vectors as input and yield other vectors (or numbers) as output.  Calculating these entities can be as exacting and tedious as counting sand grains, but they are mathematically straightforward.  By modeling physical reality as a curved tensor manifold, Einstein was able to explain how gravity affects motion and thus to create the gravitational extension of Special Relativity known as General Relativity or GR.  While the gravitational calculations of GR match those of classical mechanics under most conditions, experiment favors GR decisively in certain situations.
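
The remark that a tensor is "just a mathematical function" of vectors can be made literal.  The Python sketch below (not from the original article; the signature, c = 1 convention and sample vector are arbitrary choices) applies the flat Minkowski metric, the simplest rank-2 spacetime tensor, to a pair of vectors and returns the invariant number from the sketch above.

```python
# Illustrative sketch: a tensor as a function taking spacetime vectors to a number.
# ETA is the flat Minkowski metric with signature (+, -, -, -) and c = 1.
ETA = [[1,  0,  0,  0],
       [0, -1,  0,  0],
       [0,  0, -1,  0],
       [0,  0,  0, -1]]

def g(u, v):
    """Apply the metric tensor: g(u, v) = sum over i, j of ETA[i][j] * u[i] * v[j]."""
    return sum(ETA[i][j] * u[i] * v[j] for i in range(4) for j in range(4))

u = (3.0, 2.0, 0.0, 0.0)     # a sample spacetime vector (t, x, y, z)
print(g(u, u))               # 5.0 -- the same invariant interval as in the boost sketch
```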

Although GR is based on differential geometry intrinsic to curved surfaces – geometry in which one need never leave a surface in order to determine its properties – distances and curvatures are ultimately expressed in terms of minute spatiotemporal vectors or "line elements" which must be made infinitesimal to ensure that they never leave the "surface" of spacetime.  Thus, GR avoids the mensural necessity of an external hyperspace only by inheriting Newton’s infinitesimals paradox.  Since GR, like classical mechanics, treats these line elements as objects to be used in forming ratios and tensors, it requires an "object-oriented" (as opposed to a Weierstrass-style procedural) definition of infinitesimals.  But such a definition requires a nonstandard universe on model-theoretic grounds.  So GR depends on a nonstandard universe as much as its predecessors, and is not as self-contained as the concept of intrinsic curvature might lead one to expect.

Strictly speaking, Newtonian mechanics and all subsequent theories of physics require a nonstandard universe, i.e. a model that supports the existence of infinitesimals, for their formulation.  The effect of this requirement is to blur the distinction between physics, which purports to limit itself to the standard universe of measurable distances, and metaphysics, which can describe the standard universe as embedded in a higher-dimensional space or a nonstandard universe containing infinitesimals.  The fast acceptance of GR as a "physical" theory thus owed at least in part to the fact that physicists had already learned to swallow the infinitesimals paradox every time they used the infinitesimal calculus to do classical mechanics!  Only much later, with the advent of an infocognitive spacetime internally accommodating necessary metaphysical self-extensions, could the infinitesimal line elements of GR be properly embedded in a neoCartesian model of reality as abstract ingredients of certain distributed computations (see Appendix A).

The moral of the story up to this point is abundantly clear: both before and after Newton, the greatest advances in physics have come through the creation and absorption of metaphysical extensions.  Unfortunately, most physicists are sufficiently unclear on this fact that the word “metaphysics” remains all but unmentionable in their deliberations.

But change finds a way.

 

The Search for a Unified Field: Spacetime Gains New Dimensions

Having found so satisfying a mathematical setting for gravitation, Einstein next tried to create a Unified Field Theory (UFT) by using the same model to explain all of the fundamental forces of nature, two of which, electricity and magnetism, had already been unified by Maxwell as electromagnetism, or EM for short.  As a natural first step, Einstein tried to formulate the EM force as a tensor.  Unfortunately, EM force is governed by quantum mechanics, and 4-dimensional GR tensors lack intrinsic quantum properties.  This alone limits General Relativity to a calculational rather than explanatory bearing on electromagnetism.  Because the branch of quantum mechanics called quantum electrodynamics (QED), which treats the EM force as a particle interchange, better explained the properties and dynamics of electrons and electromagnetic fields, Einstein’s geometrodynamic approach to the UFT was widely abandoned.

After a time, however, the trek was resumed.  Kaluza-Klein theory had already added an extra dimension to spacetime and curled it up into a tight little loop of radius equal to the Planck length (10⁻³³ cm, far smaller than any known particle).  This curling maneuver explained more than the extra dimension’s invisibility; because only a discrete number of waves can fit around a loop, it also seemed to explain why particle energies are quantized, and thus to provide a connection between relativity theory and QM (in 1921, Kaluza observed that light could be viewed as the product of fifth-dimensional vibrations).  Though temporarily forgotten, this idea was ultimately revived in connection with supersymmetry, an attempt to unify the fundamental forces of nature in a single theory by defining GR-style tensors accommodating 7 additional spacetime dimensions.  Shortened and rolled up in bizarre topological configurations, these dimensions would exhibit fundamental frequencies and quantized harmonics resembling the quantum properties of tiny particles of matter.
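
The "only a discrete number of waves can fit around a loop" argument is short enough to compute.  The Python sketch below (not from the original article; the compactification radius is an assumed Planck-scale value, and the rest is just de Broglie's relation p = h/λ) lists the first few allowed wavelengths and momenta for a wave wrapped around a circular extra dimension of radius R.

```python
# Illustrative sketch: quantization from a curled-up dimension.  A wave wrapped around
# a circle of radius R must fit a whole number n of wavelengths into the circumference
# 2*pi*R, so its momentum can only take the discrete values n*hbar/R.
import math

hbar = 1.054571817e-34      # reduced Planck constant, J*s
R = 1.6e-35                 # assumed compactification radius, ~Planck length in meters

for n in range(1, 4):
    wavelength = 2 * math.pi * R / n      # n whole waves around the loop
    momentum = n * hbar / R               # the corresponding quantized momentum
    print(f"n = {n}:  wavelength = {wavelength:.2e} m   momentum = {momentum:.2e} kg*m/s")
```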

Although supersymmetry was eventually dropped because its 11-dimensional structure failed to explain subatomic chirality (whereby nature distinguishes between right- and left-handedness), its basic premises lived on in the form of 10-dimensional superstring theory.  Again, the basic idea was to add additional dimensions to GR, slice and splice these extra dimensions in such a way that they manifest the basic features of quantum mechanics, and develop the implications in the context of a series of Big Bang phase transitions (“broken symmetries”) in which matter changes form as the hot early universe cools down (mathematically, these phase transitions are represented by the arrows in the series G → H → … → SU(3) × SU(2) × U(1) → SU(3) × U(1), where alphanumerics represent algebraic symmetry groups describing the behavioral regularities of different kinds of matter under the influence of different forces, and gravity is mysteriously missing).

Unfortunately, just as General Relativity did nothing to explain the origin of 4-D spacetime or its evident propensity to “expand” when there would seem to be nothing for it to expand into, string theory did nothing to explain the origin or meaning of the n-dimensional strings into which spacetime had evolved.  Nor did it even uniquely or manageably characterize higher-dimensional spacetime structure; it required the same kind of nonstandard universe that was missing from GR in order to properly formulate quantum-scale dimensional curling, and eventually broke down into five (5) incompatible versions, all relying on difficult and ill-connected kinds of mathematics that made even the simplest calculations, and the extraction of even the most basic physical predictions, virtually impossible.  Worse yet, it was an unstratified low-order theory too weak to accommodate an explanation for quantum nonlocality or measurable cosmic expansion.

Recently, string theory has been absorbed by a jury-rigged patchwork called “membrane theory” or M-theory whose basic entity is a p-dimensional object called, one might almost suspect eponymically, a “p-brane” (no, this is not a joke).  P-branes display mathematical properties called S- and T-duality which combine in a yet-higher-level duality called the Duality of Dualities (again, this is not a joke) that suggests a reciprocity between particle size and energy that could eventually link the largest and smallest scales of the universe, and thus realize the dream of uniting large-scale physics (GR) with small-scale physics (QM).  In some respects, this is a promising insight; it applies broad logical properties of theories (e.g., duality) to what the theories “objectively” describe, thus linking reality in a deeper way to the mental process of theorization.  At the same time, the “membranes” or “bubbles” that replace strings in this theory more readily lend themselves to certain constructive interpretations.

But in other ways, M-theory is just the same old lemon with a new coat of paint.  Whether the basic objects of such theories are called strings, p-branes or bubble-branes, they lack sufficient structure and context to explain their own origins or cosmological implications, and are utterly helpless to resolve physical and cosmological paradoxes like quantum nonlocality and ex nihilo (something-from-nothing) cosmogony… paradoxes next to which the paradoxes of broken symmetry “resolved” by such theories resemble the unsightly warts on the nose of a charging rhinoceros.  In short, such entities sometimes tend to look to those unschooled in their virtues like mathematical physics run wildly and expensively amok.

Alas, the truth is somewhat worse.  Although physics has reached the point at which it can no longer credibly deny the importance of metaphysical criteria, it resists further metaphysical extension. Instead of acknowledging and dealing straightforwardly with its metaphysical dimension, it mislabels metaphysical issues as “scientific” issues and festoons them with increasingly arcane kinds of mathematics that hide its confusion regarding the underlying logic. Like a social climber determined to conceal her bohemian roots, it pretends that it is still dealing directly with observable reality instead of brachiating up vertiginous rationalistic tightropes towards abstractions that, while sometimes indirectly confirmed, are no more directly observable than fairies and unicorns.  And as science willfully distracts itself from the urgent metaphysical questions it most needs to answer, philosophy ignores its parental responsibility.

 

Reality as a Cellular Automaton: Spacetime Trades Curves for Computation

At the dawn of the computer era, the scientific mainstream sprouted a timely alternative viewpoint in the form of the Cellular Automaton Model of the Universe, which we hereby abbreviate as the CAMU.  First suggested by mathematician John von Neumann and later resurrected by salesman and computer scientist Ed Fredkin, the CAMU represents a conceptual regression of spacetime in which space and time are re-separated and described in the context of a cellular automaton.  Concisely, space is represented by (e.g.) a rectilinear array of computational cells, and time by a perfectly distributed state transformation rule uniformly governing cellular behavior.  Because automata and computational procedures are inherently quantized, this leads to a natural quantization of space and time.  Yet another apparent benefit of the CAMU is that if it can be made equivalent to a universal computer, then by definition it can realistically simulate anything that a consistent and continually evolving physical theory might call for, at least on the scale of its own universality. 
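
The CAMU picture the paragraph describes - space as an array of cells, time as one uniform, perfectly distributed update rule - can be shown in miniature.  The Python sketch below (not from the original article; the choice of elementary rule 110, the array width and the periodic boundary are arbitrary) prints a few generations of a one-dimensional cellular automaton started from a single live cell.

```python
# Illustrative sketch: "space" is a row of cells, "time" is repeated application of one
# uniform local rule.  Rule 110 maps each (left, center, right) neighborhood to a new bit.
RULE = 110
WIDTH, STEPS = 64, 20

cells = [0] * WIDTH
cells[WIDTH // 2] = 1                 # a single "event" in the initial state

for _ in range(STEPS):
    print("".join("#" if c else "." for c in cells))
    cells = [
        (RULE >> (cells[(i - 1) % WIDTH] * 4 + cells[i] * 2 + cells[(i + 1) % WIDTH])) & 1
        for i in range(WIDTH)
    ]
```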

But the CAMU, which many complexity theorists and their sympathizers in the physics community have taken quite seriously, places problematic constraints on universality.  E.g., it is not universal on all computational scales, does not allow for subjective cognition except as an emergent property of its (assumedly objective) dynamic, and turns out to be an unmitigated failure when it comes to accounting for relativistic phenomena.  Moreover, it cannot account for the origin of its own cellular array and is therefore severely handicapped from the standpoint of cosmology, which seeks to explain not only the composition but the origin of the universe. Although the CAMU array can internally accommodate the simulations of many physical observables, thus allowing the CAMU’s proponents to intriguingly describe the universe as a “self-simulation”, its inability to simulate the array itself precludes the adequate representation of higher-order physical predicates with a self-referential dimension.

 

Reality as Reality Theory: Spacetime Turns Introspective

Now let us backtrack to the first part of this history, the part in which René Descartes physically objectivized Cartesian spaces in keeping with his thesis of mind-body duality.  Notice that all of the above models sustain the mind-body distinction to the extent that cognition is regarded as an incidental side effect or irrelevant epiphenomenon of objective laws; cognition is secondary even where space and time are considered non-independent.  Yet not only is any theory meaningless in the absence of cognition, but the all-important theories of relativity and quantum mechanics, without benefit of explicit logical justification, both invoke higher-level constraints which determine the form or content of dynamical entities according to properties not of their own, but of entities that measure or interact with them.  Because these higher-level constraints are cognitive in a generalized sense, GR and QM require a joint theoretical framework in which generalized cognition is a distributed feature of reality.

Let’s try to see this another way.  In the standard objectivist view, the universe gives rise to a theorist who gives rise to a theory of the universe.  Thus, while the universe creates the theory by way of a theorist, it is not beholden to the possibly mistaken theory that results.  But while this is true as far as it goes, it cannot account for how the universe itself is created.  To fill this gap, the CTMU Metaphysical Autology Principle or MAP states that because reality is an all-inclusive relation bound by a universal quantifier whose scope is unlimited up to relevance, there is nothing external to reality with sufficient relevance to have formed it; hence, the real universe must be self-configuring.  And the Mind-Equals-Reality (M=R) Principle says that because the universe alone can provide the plan or syntax of its own self-creation, it is an "infocognitive" entity loosely analogous to a theorist in the process of introspective analysis.  Unfortunately, since objectivist theories contain no room for these basic aspects of reality, they lack the expressive power to fully satisfy relativistic, cosmological or quantum-mechanical criteria.  The ubiquity of this shortcoming reflects the absence of a necessary and fundamental logical feature of physical analysis, a higher order of theorization in which theory cognitively distributes over theory, for which no conventional theory satisfactorily accounts.

In view of the vicious paradoxes to which this failing has led, it is only natural to ask whether there exists a generalization of spacetime that contains the missing self-referential dimension of physics.  The answer, of course, is that one must exist, and any generalization that is comprehensive in an explanatory sense must explain why.  In Noesis/ECE 139, the SCSPL paradigm of the CTMU was described to just this level of detail.  Space and time were respectively identified as generalizations of information and cognition, and spacetime was described as a homogeneous self-referential medium called infocognition that evolves in a process called conspansion.  Conspansive spacetime is defined to incorporate the fundamental concepts of GR and QM in a simple and direct way that effectively preempts the paradoxes left unresolved by either theory alone.  Conspansive spacetime not only incorporates non-independent space and time axes, but logically absorbs the cognitive processes of the theorist regarding it.  Since this includes any kind of theorist cognitively addressing any aspect of reality, scientific or otherwise, the CTMU offers an additional benefit of great promise to scientists and nonscientists alike: it naturally conduces to a unification of scientific and nonscientific (e.g. humanistic, artistic and religious) thought.

 

CTMU >> CAMU in Camo

Before we explore the conspansive SCSPL model in more detail, it is worthwhile to note that the CTMU can be regarded as a generalization of the major computation-theoretic current in physics, the CAMU.  Originally called the Computation-Theoretic Model of the Universe, the CTMU was initially defined on a hierarchical nesting of universal computers, the Nested Simulation Tableau or NeST, which tentatively described spacetime as stratified virtual reality in order to resolve a decision-theoretic paradox put forth by Los Alamos physicist William Newcomb (see Noesis 44, etc.).  Newcomb’s paradox is essentially a paradox of reverse causality with strong implications for the existence of free will, and thus has deep ramifications regarding the nature of time in self-configuring or self-creating systems of the kind that MAP shows reality must be.  Concisely, it permits reality to freely create itself from within by using its own structure, without benefit of any outside agency residing in any external domain.

Although the CTMU subjects NeST to metalogical constraints not discussed in connection with Newcomb’s Paradox, NeST-style computational stratification is essential to the structure of conspansive spacetime.  The CTMU thus absorbs the greatest strengths of the CAMU – those attending quantized distributed computation – without absorbing its a priori constraints on scale or sacrificing the invaluable legacy of Relativity.  That is, because the extended CTMU definition of spacetime incorporates a self-referential, self-distributed, self-scaling universal automaton, the tensors of GR and its many-dimensional offshoots can exist within its computational matrix.

An important detail must be noted regarding the distinction between the CAMU and CTMU.  By its nature, the CTMU replaces ordinary mechanical computation with what might better be called protocomputation.  Whereas computation is a process defined with respect to a specific machine model, e.g. a Turing machine, protocomputation is logically "pre-mechanical".  That is, before computation can occur, there must (in principle) be a physically realizable machine to host it.  But in discussing the origins of the physical universe, the prior existence of a physical machine cannot be assumed.  Instead, we must consider a process capable of giving rise to physical reality itself...a process capable of not only implementing a computational syntax, but of serving as its own computational syntax by self-filtration from a realm of syntactic potential.  When the word "computation" appears in the CTMU, it is usually to protocomputation that reference is being made.

It is at this point that the theory of languages becomes indispensable.  In the theory of computation, a "language" is anything fed to and processed by a computer; thus, if we imagine that reality is in certain respects like a computer simulation, it is a language.  But where no computer exists (because there is not yet a universe in which it can exist), there is no "hardware" to process the language, or for that matter the metalanguage simulating the creation of hardware and language themselves.  So with respect to the origin of the universe, language and hardware must somehow emerge as one; instead of engaging in a chicken-or-egg regress involving their recursive relationship, we must consider a self-contained, dual-aspect entity functioning simultaneously as both.  By definition, this entity is a Self-Configuring Self-Processing Language or SCSPL.  Whereas ordinary computation involves a language, protocomputation involves SCSPL.  

Protocomputation has a projective character consistent with the SCSPL paradigm.  Just as all possible formations in a language - the set of all possible strings - can be generated from a single distributed syntax, and all grammatical transformations of a given string can be generated from a single copy thereof, all predicates involving a common syntactic component are generated from the integral component itself.  Rather than saying that the common component is distributed over many values of some differential predicate - e.g., that some distributed feature of programming is distributed over many processors - we can say (to some extent equivalently) that many values of the differential predicate - e.g. spatial location - are internally or endomorphically projected within the common component, with respect to which they are "in superposition".  After all, difference or multiplicity is a logical relation, and logical relations possess logical coherence or unity; where the relation has logical priority over the reland, unity has priority over multiplicity.  So instead of putting multiplicity before unity and pluralism ahead of monism, CTMU protocomputation, under the mandate of a third CTMU principle called Multiplex Unity or MU, puts the horse sensibly ahead of the cart.
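
The language-theoretic point that one shared syntax generates the whole set of strings can be seen in a toy Python sketch (not from the original article; the single production rule and the depth bound are arbitrary illustrations, not SCSPL itself).  Every string of the toy language below issues from the same small, globally distributed rule set.

```python
# Illustrative sketch: many strings, one distributed syntax.  A single rule set
# generates every formation of the toy language {a^n b^n}.
RULES = {"S": ["aSb", "ab"]}

def generate(symbol="S", depth=4):
    """Enumerate the strings derivable from `symbol` within a bounded derivation depth."""
    if depth == 0:
        return set()
    out = set()
    for production in RULES.get(symbol, []):
        if "S" in production:
            for inner in generate("S", depth - 1):
                out.add(production.replace("S", inner))
        else:
            out.add(production)
    return out

print(sorted(generate(), key=len))   # ['ab', 'aabb', 'aaabbb', 'aaaabbbb']
```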

To return to one of the central themes of this article, SCSPL and protocomputation are metaphysical concepts.  Physics is unnecessary to explain them, but they are necessary to explain physics.  So again, what we are describing here is a metaphysical extension of the language of physics.  Without such an extension linking the physical universe to the ontological substrate from which it springs - explaining what physical reality is, where it came from, and how and why it exists - the explanatory regress of physical science would ultimately lead to the inexplicable and thus to the meaningless. 

        

Spacetime Requantization and the Cosmological Constant

The CTMU, and to a lesser extent GR itself, posits certain limitations on exterior measurement. GR utilizes (so-called) intrinsic spacetime curvature in order to avoid the necessity of explaining an external metaphysical domain from which spacetime can be measured, while MAP simply states, in a more sophisticated way consistent with infocognitive spacetime structure as prescribed by M=R and MU, that this is a matter of logical necessity (see Noesis/ECE 139, pp. 3-10).  Concisely, if there were such an exterior domain, then it would be an autologous extrapolation of the Human Cognitive Syntax (HCS) that should properly be included in the spacetime to be measured.  [As previously explained, the HCS, a synopsis of the most general theoretical language available to the human mind (cognition), is a supertautological formulation of reality as recognized by the HCS.  Where CTMU spacetime consists of HCS infocognition distributed over itself in a way isomorphic to NeST – i.e., of a stratified NeST computer whose levels have infocognitive HCS structure – the HCS spans the laws of mind and nature.  If something cannot be mapped to HCS categories by acts of cognition, perception or reference, then it is HCS-unrecognizable and excluded from HCS reality due to nonhomomorphism; conversely, if it can be mapped to the HCS in a physically-relevant way, then it is real and must be explained by reality theory.]

Accordingly, the universe as a whole must be treated as a static domain whose self and contents cannot “expand”, but only seem to expand because they are undergoing internal rescaling as a function of SCSPL grammar.  The universe is not actually expanding in any absolute, externally-measurable sense; rather, its contents are shrinking relative to it, and to maintain local geometric and dynamical consistency, it appears to expand relative to them. Already introduced as conspansion (contraction qua expansion), this process reduces physical change to a form of "grammatical substitution" in which the geometrodynamic state of a spatial relation is differentially expressed within an ambient cognitive image of its previous state. By running this scenario backwards and regressing through time, we eventually arrive at the source of geometrodynamic and quantum-theoretic reality: a primeval conspansive domain consisting of pure physical potential embodied in the self-distributed "infocognitive syntax" of the physical universe…i.e., the laws of physics, which in turn reside in the more general HCS.

Conspansion consists of two complementary processes, requantization and inner expansion. Requantization downsizes the content of Planck’s constant by applying a quantized scaling factor to successive layers of space corresponding to levels of distributed parallel computation.  This inverse scaling factor 1/R is just the reciprocal of the cosmological scaling factor R, the ratio of the current apparent size dn(U) of the expanding universe to its original (Higgs condensation) size d0(U)=1.  Meanwhile, inner expansion outwardly distributes the images of past events at the speed of light within progressively-requantized layers.  As layers are rescaled, the rate of inner expansion, and the speed and wavelength of light, change with respect to d0(U) so that relationships among basic physical processes do not change…i.e., so as to effect nomological covariance.  The thrust is to relativize space and time measurements so that spatial relations have different diameters and rates of diametric change from different spacetime vantages. This merely continues a long tradition in physics; just as Galileo relativized motion and Einstein relativized distances and durations to explain gravity, this is a relativization for conspansive “antigravity” (see Appendix B).
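
The bare arithmetic behind the previous section's claim - that contents shrinking relative to a fixed whole and the whole expanding relative to fixed contents are the same thing to an observer who measures with the contents - is shown in the toy Python sketch below.  It is not CTMU machinery and not from the original article; the step-by-step halving is an arbitrary stand-in for the scaling factor 1/R.

```python
# Illustrative toy: relative rescaling.  The "universe" U is held fixed by assumption;
# only the local ruler (a typical physical system) is rescaled at each step.
U = 1.0                       # size of the whole, fixed
ruler = 1.0                   # original size of a local ruler

for step, scale in enumerate([1.0, 0.5, 0.25, 0.125]):
    ruler_n = ruler * scale                   # contents rescaled by the factor 1/R
    apparent = U / ruler_n                    # the whole, measured in current ruler units
    print(f"step {step}:  ruler = {ruler_n:5.3f}   universe spans {apparent:6.2f} rulers")
```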

Conspansion is not just a physical operation, but a logical one as well.  Because physical objects unambiguously maintain their identities and physical properties as spacetime evolves, spacetime must directly obey the rules of 2VL (2-valued logic distinguishing what is true from what is false).  Spacetime evolution can thus be straightforwardly depicted by Venn diagrams in which the truth attribute, a high-order metapredicate of any physical predicate, corresponds to topological inclusion in a spatial domain corresponding to specific physical attributes.  I.e., to be true, an effect must be not only logically but topologically contained by the cause; to inherit properties determined by an antecedent event, objects involved in consequent events must appear within its logical and spatiotemporal image.  In short, logic equals spacetime topology.

This 2VL rule, which governs the relationship between the Space-Time-Object and Logico-Mathematical subsyntaxes of the HCS, follows from the dual relationship between set theory and semantics, whereby predicating membership in a set corresponds to attributing a property defined on or defining the set.  The property is a “qualitative space” topologically containing that to which it is logically attributed.  Since the laws of nature could not apply if the sets that contain their arguments and the properties that serve as their parameters were not mutually present at the place and time of application, and since QM blurs the point of application into a region of distributive spacetime potential, events governed by natural laws must occur within a region of spacetime over which their parameters are distributed.
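
The set/predicate duality invoked here is standard and easily illustrated.  In the Python sketch below (not from the original article; the properties "even" and "divisible by 4" and the finite universe are arbitrary examples), attributing a property corresponds to membership in the set it defines, and implication between properties corresponds to containment between the sets.

```python
# Illustrative sketch: properties as sets, implication as containment.
universe = range(-10, 11)

P = {x for x in universe if x % 2 == 0}        # the property "even", as the set it defines
Q = {x for x in universe if x % 4 == 0}        # the property "divisible by 4"

implication_holds = all(x in P for x in Q)     # Q(x) -> P(x) for every x in the universe
containment_holds = Q <= P                     # the set Q lies inside the set P
print(implication_holds, containment_holds)    # True True: the same fact in two guises
```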

Conspansive domains interpenetrate against the background of past events at the inner expansion rate c, defined as the maximum ratio of distance to duration by the current scaling, and recollapse through quantum interaction.  Conspansion thus defines a kind of “absolute time” metering and safeguarding causality.  Interpenetration of conspansive domains, which involves a special logical operation called unisection (distributed intersection) combining aspects of the set-theoretic operations union and intersection, creates an infocognitive relation of sufficiently high order to effect quantum collapse.  Output is selectively determined by ESP interference and reinforcement within and among metrical layers.

Because self-configurative spacetime grammar is conspansive by necessity, the universe is necessarily subject to a requantizative “accelerative force” that causes its apparent expansion. The force in question, which Einstein symbolized by the cosmological constant lambda, is all but inexplicable in any nonconspansive model; that no such model can cogently explain it is why he later relented and described lambda as “the greatest blunder of his career”.  By contrast, the CTMU requires it as a necessary mechanism of SCSPL grammar.  Thus, recent experimental evidence – in particular, recently-acquired data on high-redshift Type Ia supernovae that seem to imply the existence of such a force – may be regarded as powerful (if still tentative) empirical confirmation of the CTMU.

 

Metrical Layering

In a conspansive universe, the spacetime metric undergoes constant rescaling.  Whereas Einstein required a generalization of Cartesian space embodying higher-order geometric properties like spacetime curvature, conspansion requires a yet higher order of generalization in which even relativistic properties, e.g. spacetime curvature inhering in the gravitational field, can be progressively rescaled.  Where physical fields of force control or program dynamical geometry, and programming is logically stratified as in NeST, fields become layered stacks of parallel distributive programming that decompose into field strata (conspansive layers) related by an intrinsic requantization function inhering in, and logically inherited from, the most primitive and connective layer of the stack.  This "storage process" by which infocognitive spacetime records its logical history is called metrical layering (note that since storage is effected by inner-expansive domains which are internally atemporal, this is to some extent a misnomer reflecting weaknesses in standard models of computation).

The metrical layering concept does not involve complicated reasoning.  It suffices to note that distributed (as in “event images are outwardly distributed in layers of parallel computation by inner expansion”) effectively means “of 0 intrinsic diameter” with respect to the distributed attribute.  If an attribute corresponding to a logical relation of any order is distributed over a mathematical or physical domain, then interior points of the domain are undifferentiated with respect to it, and it need not be transmitted among them.  Where space and time exist only with respect to logical distinctions among attributes, metrical differentiation can occur within inner-expansive domains (IEDs) only upon the introduction of consequent attributes relative to which position is redefined in an overlying metrical layer, and what we usually call “the metric” is a function of the total relationship among all layers.

The spacetime metric thus amounts to a Venn-diagrammatic conspansive history in which every conspansive domain (lightcone cross section, Venn sphere) has virtual 0 diameter with respect to distributed attributes, despite apparent nonzero diameter with respect to metrical relations among subsequent events.  What appears to be nonlocal transmission of information can thus seem to occur.  Nevertheless, the CTMU is a localistic theory in every sense of the word; information is never exchanged “faster than conspansion”, i.e. faster than light (the CTMU’s unique explanation of quantum nonlocality within a localistic model is what entitles it to call itself a consistent “extension” of relativity theory, to which the locality principle is fundamental). 

Metrical layering lets neo-Cartesian spacetime interface with predicate logic in such a way that in addition to the set of “localistic” spacetime intervals riding atop the stack (and subject to relativistic variation in space and time measurements), there exists an underlying predicate logic of spatiotemporal contents obeying a different kind of metric.  Spacetime thus becomes a logical construct reflecting the logical evolution of that which it models, thereby extending the Lorentz-Minkowski-Einstein generalization of Cartesian space.  Graphically, the CTMU places a logical, stratified computational construction on spacetime, implants a conspansive requantization function in its deepest, most distributive layer of logic (or highest, most parallel level of computation), and rotates the spacetime diagram depicting the dynamical history of the universe by 90° along the space axes.  Thus, one perceives the model’s evolution as a conspansive overlay of physically-parametrized Venn diagrams directly through the time (SCSPL grammar) axis rather than through an extraneous z axis artificially separating theorist from diagram.  The cognition of the modeler – his or her perceptual internalization of the model – is thereby identified with cosmic time, and infocognitive closure occurs as the model absorbs the modeler in the act of absorbing the model. 

To make things even simpler: the CTMU equates reality to logic, logic to mind, and (by transitivity of equality) reality to mind.  Then it makes a big Venn diagram out of all three, assigns appropriate logical and mathematical functions to the diagram, and deduces implications in light of empirical data.  A little reflection reveals that it would be hard to imagine a simpler or more logical theory of reality.

 

The CTMU and Quantum Theory 

The microscopic implications of conspansion are in remarkable accord with basic physical criteria.  In a self-distributed (perfectly self-similar) universe, every event should mirror the event that creates the universe itself.  In terms of an implosive inversion of the standard (Big Bang) model, this means that every event should to some extent mirror the primal event consisting of a condensation of Higgs energy distributing elementary particles and their quantum attributes, including mass and relative velocity, throughout the universe.  To borrow from evolutionary biology, spacetime ontogeny recapitulates cosmic phylogeny; every part of the universe should repeat the formative process of the universe itself. 

Thus, just as the initial collapse of the quantum wavefunction (QWF) of the causally self-contained universe is internal to the universe, the requantizative occurrence of each subsequent event is topologically internal to that event, and the cause spatially contains the effect.  The implications regarding quantum nonlocality are clear.  No longer must information propagate at superluminal velocity between spin-correlated particles; instead, the information required for (e.g.) spin conservation is distributed over their joint ancestral IED…the virtual 0-diameter spatiotemporal image of the event that spawned both particles as a correlated ensemble.  The internal parallelism of this domain – the fact that neither distance nor duration can bind within it – short-circuits spatiotemporal transmission on a logical level.  A kind of “logical superconductor”, the domain offers no resistance across the gap between correlated particles; in fact, the “gap” does not exist!  Computations on the domain’s distributive logical relations are as perfectly self-distributed as the relations themselves.

Equivalently, any valid logical description of spacetime has a property called hology, whereby the logical structure of the NeST universal automaton – that is, logic in its entirety - distributes over spacetime at all scales along with the automaton itself.  Notice the etymological resemblance of hology to holography, a term used by physicist David Bohm to describe his own primitive nonlocal interpretation of QM.  The difference: while Bohm’s Pilot Wave Theory was unclear on the exact nature of the "implicate order" forced by quantum nonlocality on the universe - an implicate order inevitably associated with conspansion - the CTMU answers this question in a way that satisfies Bell's theorem with no messy dichotomy between classical and quantum reality.  Indeed, the CTMU is a true localistic theory in which nothing outruns the conspansive mechanism of light propagation. 

The implications of conspansion for quantum physics as a whole, including wavefunction collapse and entanglement, are similarly obvious.  No less gratifying is the fact that the nondeterministic computations posited in abstract computer science are largely indistinguishable from what occurs in QWF collapse, where just one possibility out of many is inexplicably realized (while the CTMU offers an explanation called the Extended Superposition Principle or ESP, standard physics contains no comparable principle).  In conspansive spacetime, time itself becomes a process of wave-particle dualization mirroring the expansive and collapsative stages of SCSPL grammar, embodying the recursive syntactic relationship of space, time and object. 

There is no alternative to conspansion as an explanation of quantum nonlocality.  Any nonconspansive, classically-oriented explanation would require that one of the following three principles be broken: the principle of realism, which holds that patterns among phenomena exist independently of particular observations; the principle of induction, whereby such patterns are imputed to orderly causes; and the principle of locality, which says that nothing travels faster than light.  The CTMU, on the other hand, preserves these principles by distributing generalized observation over reality in the form of generalized cognition; making classical causation a stable function of distributed SCSPL grammar; and ensuring by its structure that no law of physics requires faster-than-light communication.  So if basic tenets of science are to be upheld, Bell’s theorem must be taken to imply the CTMU. 

As previously described, if the conspanding universe were projected in an internal plane, its evolution would look like ripples (infocognitive events) spreading outward on the surface of a pond, with new ripples starting in the intersects of their immediate ancestors.  Just as in the pond, old ripples continue to spread outward in ever-deeper layers, carrying their virtual 0 diameters along with them.  This is why we can collapse the past history of a cosmic particle by observing it in the present, and why, as surely as Newcomb’s demon, we can determine the past through regressive metric layers corresponding to a rising sequence of NeST strata leading to the stratum corresponding to the particle’s last determinant event.  The deeper and farther back in time we regress, the higher and more comprehensive the level of NeST that we reach, until finally, like John Wheeler himself, we achieve “observer participation” in the highest, most parallelized level of NeST...the level corresponding to the very birth of reality.

 

Appendix A

Analysis is based on the concept of the derivative, an "instantaneous (rate of) change". Because an "instant" is durationless (of 0 extent) while a "change" is not, this is an oxymoron. Cauchy and Weierstrass tried to resolve this paradox with the concept of "limits"; they failed. This led to the discovery of nonstandard analysis by Abraham Robinson. The CTMU incorporates a conspansive extension of nonstandard analysis in which infinitesimal elements of the hyperreal numbers of NSA are interpreted as having internal structure, i.e. as having nonzero internal extent. Because they are defined as being indistinguishable from 0 in the real numbers R, i.e. the real subset of the hyperreals H, this permits us to speak of an "instantaneous rate of change"; while the "instant" in question is of 0 external extent in R, it is of nonzero internal extent in H. Thus, in taking the derivative of (e.g.) x², both sides of the equation

Δy/Δx = 2x + Δx

(where Δ = "delta" = a generic increment) are nonzero, simultaneous and in balance. That is, we can take Δx to 0 in R and drop it on the right with no loss of precision while avoiding a division by 0 on the left. More generally, the generic equation

lim (Δx_H → 0_R) Δy/Δx = lim (Δx_H → 0_R) [f(x + Δx) − f(x)]/Δx

no longer involves a forbidden "division by 0"; the division takes place in H, while the zeroing-out of Δx takes place in R. H and R, respectively "inside" and "outside" the limit and thus associated with the limit and the approach thereto, are model-theoretically identified with the two phases of the conspansion process L-sim and L-out, as conventionally related by wave-particle duality. This leads to the CTMU "Sum Over Futures" (SOF) interpretation of quantum mechanics, incorporating an Extended Superposition Principle (ESP) under the guidance of the CTMU Telic Principle, which asserts that the universe is intrinsically self-configuring.
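
Written out in full for the x² example (nothing here beyond the standard algebra of the difference quotient):

\[
\frac{\Delta y}{\Delta x}
  \;=\; \frac{(x+\Delta x)^{2}-x^{2}}{\Delta x}
  \;=\; \frac{2x\,\Delta x+(\Delta x)^{2}}{\Delta x}
  \;=\; 2x+\Delta x .
\]

The division by Δx is performed while Δx is still treated as nonzero (its nonzero internal extent in H); only afterward is Δx set to 0 on the right (its 0 external extent in R), leaving dy/dx = 2x.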

In this new CTMU extension of nonstandard analysis, the universe can have an undefined ("virtually 0") external extent while internally being a "conspansively differentiable manifold". This, of course, describes a true intrinsic geometry incorporating intrinsic time as well as intrinsic space; so much for relativity theory. In providing a unified foundation for mathematics, the CTMU incorporates complementary extensions of logic, set theory and algebra. Because physics is a blend of perception (observation and experiment) and mathematics, providing mathematics with a unified foundation (by interpreting it in a unified physical reality) also provides physics with a unified foundation (by interpreting it in a unified mathematical reality). Thus, by conspansive duality, math and physics are recursively united in a dual-aspect reality wherein they fill mutually foundational roles.
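
For readers who want something executable, the idea of an increment that is externally indistinguishable from 0 yet internally structured has a far more modest, well-known cousin: the dual numbers used in forward-mode automatic differentiation. The sketch below is that standard construction, not Robinson's hyperreals and not the CTMU's conspansive extension of them; it is offered only as a concrete computation in which the infinitesimal part is carried as internal structure and no division by a vanishing quantity ever occurs:

```python
# Dual-number sketch: an increment with "internal structure" (forward-mode AD).
# This is a standard textbook construction, used here purely as an analogy.

from dataclasses import dataclass

@dataclass
class Dual:
    real: float   # the "external" real part
    eps: float    # coefficient of the infinitesimal part (terms in eps**2 are dropped)

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other, 0.0)
        return Dual(self.real + other.real, self.eps + other.eps)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other, 0.0)
        # (a + b·eps)(c + d·eps) = ac + (ad + bc)·eps, since eps**2 = 0
        return Dual(self.real * other.real,
                    self.real * other.eps + self.eps * other.real)

    __rmul__ = __mul__

def derivative(f, x: float) -> float:
    """Evaluate f at x + eps and read the derivative off the eps coefficient."""
    return f(Dual(x, 1.0)).eps

if __name__ == "__main__":
    print(derivative(lambda t: t * t, 3.0))   # -> 6.0, i.e. d(x²)/dx at x = 3
```

Here `eps` plays the role of the internally nonzero increment: the derivative is read off its coefficient rather than obtained by dividing by a quantity that has already been set to 0.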

[If you want to know more about how the CTMU is derived using logic and set theory, check out these on-line papers:

http://www.ctmu.org/CTMU/Articles/OnAbsoluteTruth.html

http://www.ctmu.org/CTMU/Articles/IntroCTMU.html

I'm currently working on additional papers.]

 

Appendix B

Because the value of R can only be theoretically approximated, using R or even R⁻¹ to describe requantization makes it appear that we are simply using one theory to justify another.  But the R-to-R⁻¹ inversion comes with an addition of logical structure, and it is this additional structure that enables us to define a high-level physical process, conspansion, that opposes gravity and explains accelerating redshift. Conspansive requantization is uniform at all scales and can be seen as a function of the entire universe or of individual quanta; every part of the universe is grammatically substituted, or injectively mapped, into an image of its former self…an image endowed with computational functionability.  To understand this, we must take a look at standard cosmology.

Standard cosmology views cosmic expansion in terms of a model called ERSU, the Expanding Rubber Sheet Universe.  For present purposes, it is sufficient to consider a simplified 2-spheric ERSU whose objects and observers are confined to its expanding 2-dimensional surface.  In ERSU, the sizes of material objects remain constant while space expands like an inflating balloon (if objects grew at the rate of space itself, expansion could not be detected).  At the same time, spatial distances among comoving objects free of peculiar motions remain fixed with respect to any global comoving coordinate system; in this sense, the mutual rescaling of matter and space is symmetric.  But either way, the space occupied by an object is considered to “stretch” without the object itself being stretched.

Aside from being paradoxical on its face, this violates the basic premise of the pure geometrodynamic view of physical reality, which ultimately implies that matter is “space in motion relative to itself”.  If we nevertheless adhere to ERSU and the Standard Model, the expansion rate (prior to gravitational opposition) is constant when expressed in terms of material dimensions, i.e., with respect to the original scale of the universe relative to which objects remain constant in size.  For example, if ERSU expansion were to be viewed as an outward layering process in which the top layer is “now”, the factor of linear expansion relating successive layers would be the quotient of their circumferences.  Because object size is static, so is the cosmic time scale when expressed in terms of basic physical processes; at any stage of cosmic evolution, time is scaled exactly as it was in the beginning.
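
In symbols (notation mine, not the paper's): if C_n and R_n are the circumference and scale factor of the ERSU 2-sphere when layer n is laid down, the linear factor relating successive layers is simply the quotient just described,

\[
k_{n} \;=\; \frac{C_{n+1}}{C_{n}} \;=\; \frac{2\pi R_{n+1}}{2\pi R_{n}} \;=\; \frac{R_{n+1}}{R_{n}} .
\]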

The idea behind the CTMU is to use advanced logic, algebra and computation theory to give spacetime a stratified computational or cognitive structure that lets ERSU be “inverted” and ERSU paradoxes resolved.  To glimpse how this is done, just look at the ERSU balloon from the inside instead of the outside.  Now imagine that its size remains constant as thin, transparent layers of parallel distributed computation grow inward, and that as objects are carried towards the center by each newly-created layer, they are proportionately resized.  Instead of the universe expanding relative to objects whose sizes remain constant, the size of the universe remains constant and objects do the shrinking…along with any time scale expressed in terms of basic physical processes defined on those objects.  Now imagine that as objects and time scales remain in their shrunken state, layers become infinitesimally thin and recede outward, with newer levels of space becoming “denser” relative to older ones and older levels becoming “stretched” relative to newer ones.  In the older layers, light – which propagates in the form of a distributed parallel computation – “retroactively” slows down as it is forced to travel through more densely-quantized overlying layers.
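
A minimal numerical sketch of this inverted picture follows, under assumptions of my own rather than the paper's: one requantization step per "layer", a constant per-layer rescaling factor r < 1, and light whose wavelength stays fixed relative to the scale of the layer in which it was emitted.

```python
# Toy numerical sketch of the inverted-ERSU picture described above.
# Objects (and locally defined rulers/clocks) shrink by a factor r at each new
# layer while the overall size of the universe stays constant; light in transit
# keeps the scale of its emission layer, so measured against the newer, denser
# scale its wavelength appears stretched.

def usre_redshift(emit_layer: int, obs_layer: int, r: float = 0.999) -> float:
    """Apparent redshift z of light emitted at emit_layer and observed at obs_layer."""
    steps = obs_layer - emit_layer
    scale_ratio = (1.0 / r) ** steps      # old (emission) scale / new (observation) scale
    return scale_ratio - 1.0              # 1 + z = old scale / new scale

if __name__ == "__main__":
    for n in (10, 100, 1000):
        print(f"layers elapsed = {n:5d}   z = {usre_redshift(0, n):.4f}")
```

Measured against the newer, denser scale, light from n layers back appears stretched by (1/r)^n; numerically this is the same redshift that the ERSU description attributes to the stretching of space itself.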

To keep the distinction easy to track, we will give the ERSU and inverted-ERSU models opposite spellings; i.e., inverted-ERSU will become USRE.  This turns out to be meaningful as well as convenient, for there happens to be an apt descriptive phrase for which USRE is acronymic: the Universe as a Self-Representational Entity.  This phrase is consistent with the idea that the universe is a self-creative, internally-iterated computational endomorphism.

It is important to be clear on the relationship between space and time in USRE.  The laws of physics are generally expressed as differential equations describing physical processes in terms of other physical processes incorporating material dimensions.  When time appears in such an equation, its units are understood to correspond to basic physical processes defined on the sizes of physical objects.  Thus, any rescaling of objects must be accompanied by an appropriate rescaling of time if the laws of physics are to be preserved.  Where the material contents of spacetime behave in perfect accord with the medium they occupy, they contract as spacetime is requantized, and in order for the laws of physics to remain constant, time must contract apace. 

E.g., if at any point it takes n time units for light to cross the diameter of a proton, it must take the same number of units at any later juncture. If the proton contracts in the interval, the time scale must contract accordingly, and the speed and wavelength of newly-emitted light must diminish relative to former values to maintain the proper distribution of frequencies.  But meanwhile, light already in transit slows down due to the distributed “stretching” of its deeper layer of space, i.e., the superimposition of more densely-quantized layers.  Since its wavelength is fixed with respect to its own comoving scale (and that of the universe as a whole), wavelength rises and frequency falls relative to newer, denser scales.
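
The invariance being invoked can be written compactly (symbols mine): with d_p the proton diameter, τ the basic time unit and c the speed of newly emitted light, the requirement is

\[
n \;=\; \frac{d_{p}/c}{\tau} \;=\; \frac{d_{p}}{c\,\tau} \;=\; \text{constant},
\]

so if d_p is rescaled by a factor r at requantization, the product cτ must be rescaled by the same factor; the passage above describes how that rescaling is shared between the time scale and the speed and wavelength of newly emitted light.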

Complementary recalibration of space and time scales accounts for cosmic redshift in the USRE model.  But on a deeper level, the explanation lies in the nature of space and time themselves. In ERSU, time acts externally on space, stretching and deforming it against an unspecified background and transferring its content from point to point by virtual osmosis.  But in USRE, time and motion are implemented wholly within the spatial locales to which they apply.  Thus, if cosmic redshift data indicate that “expansion accelerates” in ERSU, the inverse USRE formulation says that spacetime requantization accelerates with respect to the iteration of a constant fractional multiplier…and that meanwhile, inner expansion undergoes a complementary "deceleration" relative to the invariant size of the universe.  In this way, the two phases of conspansion work together to preserve the laws of nature.

The crux: as ERSU expands and the cosmological scaling factor R rises, the USRE inverse scaling factor 1/R falls (this factor is expressed elsewhere in a time-independent form r).  As ERSU swells and light waves get longer and lower in frequency, USRE quanta shrink with like results.  In either model, the speed of light falls with respect to any global comoving coordinate system; cn/c0 = R0/Rn = Rn⁻¹/R0⁻¹ (the idea that c is an “absolute constant” in ERSU is oversimplistic; like material dimensions, the speed of light can be seen to change with respect to comoving space in cosmological time).  But only in USRE does the whole process become a distributed logico-mathematical endomorphism effected in situ by the universe itself…a true local implementation of physical law rather than a merely localistic transfer of content based on a disjunction of space and logic.  The point is to preserve valid ERSU relationships while changing their interpretations so as to resolve paradoxes of ERSU cosmology and physics.

In Noesis/ECE 139, it was remarked that if the universe were projected on an internal plane, spacetime evolution would resemble spreading ripples on the surface of a pond, with new ripples starting in the intersects of old ones.  Ripples represent events, or nomological (SCSPL-syntactic) combinations of material objects implicit as ensembles of distributed properties (quantum numbers).  Now we see that outer (subsurface) ripples become internally dilated as distances shorten and time accelerates within new ripples generated on the surface.

CTMU monism says that the universe consists of one “dual-aspect” substance, infocognition, created by internal feedback within an even more basic (one-aspect) substance called telesis. That everything in the universe can manifest itself as either information or cognition (and on combined scales, as both) can easily be confirmed by the human experience of personal consciousness, in which the self exists as information to its own cognition…i.e., as an object or relation subject to its own temporal processing.  If certain irrelevant constraints distinguishing a human brain from other kinds of object are dropped, information and cognition become identical to spatial relations and time.

In a composite object (like a brain) consisting of multiple parts, the dual aspects of infocognition become crowded together in spacetime.  But in the quantum realm, this “monic duality” takes the form of an alternation basic to the evolution of spacetime itself.  This alternation usually goes by the name of wave-particle duality, and refers to the inner-expansive and collapsative phases of the quantum wave function.  Where ripples represent the expansive (or cognitive) phase, and their collapsation into new events determines the informational phase, the above reasoning can be expressed as follows: as the infocognitive universe evolves, the absolute rate of spatiotemporal cognition cn at time n, as measured in absolute (conserved) units of spacetime, is inversely proportional to the absolute information density Rn/R0 of typical physical systems...i.e., to the concentration of locally-processed physical information.  As light slows down, more SCSPL-grammatical (generalized cognitive) steps are performed per unit of absolute distance traversed.  So with respect to meaningful content, the universe remains steady in the process of self-creation.
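
In the notation of Appendix B, the stated proportionality is just

\[
\frac{c_{n}}{c_{0}} \;=\; \left(\frac{R_{n}}{R_{0}}\right)^{-1} \;=\; \frac{R_{0}}{R_{n}},
\]

so the number of SCSPL-grammatical steps performed per unit of absolute distance scales as c_0/c_n = R_n/R_0: as the information density rises, light slows and more generalized-cognitive steps fit into each absolute unit traversed.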

© 1998-2002 by Christopher Michael Langan

 

Partial Bibliography

1.    Exploding stars flash new bulletins from distant universe

       James Glanz in Science, May 15, 1998 v280 n5366 p1008

2.    To Infinity and Beyond

       Robert Matthews in New Scientist, Apr 11, 1998 p27

3.    Astronomers see a cosmic antigravity force at work

       James Glanz in Science, Feb 27, 1998 v279 n5355 p1298

4.    The Theory Formerly Known as Strings

       Michael J. Duff in Scientific American, Feb 1998, v278 n2 p64

5.    Exploding stars point to a universal repulsive force

       James Glanz in Science, Jan 30, 1998 v279 n5351 p651

6.    Mind and Body: René Descartes to William James

       Robert H. Wozniak, 1996 <www.brynmawr.serendip.edu>

7.    The Mind of God: The Scientific Basis for a Rational World by Paul Davies

       Simon and Schuster, 1992

8.    A Brief History of Time by Stephen W. Hawking 

       Bantam Books, 1988 

9.    Quantum Reality: Beyond the New Physics by Nick Herbert

       Anchor Press/Doubleday, 1985

10.  Cosmology: The Science of the Universe by Edward R. Harrison

       Cambridge University Press, 1981

11.  The Mathematical Experience by Philip J. Davis and Reuben Hersh

        Birkhäuser, 1981