


A Quick Look at Reflexivity


The Theory of Reflexivity is a scientific concept that attempts to explain all that has ever existed and all that will ever exist in terms of a single, easily understood process: optimized reflexive information wave dynamics.


The Theory of Reflexivity is based exclusively on observations of the nature of our universe. It uses these observations as a starting point for the establishment of generally applicable materialist and positivist theoretical concepts.


The process for the conceptualization of the theory derives from the idea of reverse engineering. It is based on the identification of certain significant features and events from the present and the past and the subsequent creation of a theory that links and explains them.


The present state of the universe is known to a certain extent. It exists at scales from the very small to the very large. These scales are characterized by certain processes: quantum mechanical at very small scales and relativistic at large scales. At the middle scale we find life, consciousness, self-awareness and civilization. Furthermore, certain obvious historical signposts that one could expect to encounter on a journey from the past to the present can be identified. These include the beginning of the universe, the beginning of life in the universe, the beginning of consciousness and self-awareness, and the beginning of civilization.


With these ideas in mind a theory can be created. Initial conditions can be established. These must be very simple, yet at the same time allow for creativity and flexibility. It is also necessary to establish a simple process, the heart of the theory, which can be applied to these initial conditions. Once the process is established it must be consistently applied in a way that allows us to take a journey from the beginning of time (assuming that there was one), or at least from a very distant past, to the present. It must do so in a way that not only allows us to encounter and explain the historical signposts we have identified but also brings us to a point where we can explain present observations.


Reflexivity freely incorporates ideas from other theories and principles, in particular:


Information Theory – Information processing


Cybernetic Theory – Reflexivity, Meta System Transformation


Wave Theory – Wave dynamics


Action Principle - Optimization


Indeed it can be said that every aspect of this theory, from the observations upon which it is based to the four concepts mentioned above, and to many others, including phase transition and evolution, has been extremely well known for at least fifty years. In other words, the explanation for everything has been staring at us for several decades. In order to see it we need simply arrange the pieces in a way that forms the picture.


This is the merit of the Theory of Reflexivity. It provides a simple concept around which these ideas and others can be assembled.


The theory has three parts. The first, The Motor, describes the process that makes the multiverse work. The second, Potentials, describes the possibilities for creation inherent in the theory. The final part, Limits, outlines the nature of the restraints that restrict the Potentials.


Reflexivity can be thought of as an operating system running on a computer that consists of the universal medium. This operating system creates an infinitely dimensioned, infinitely varied, twisting, churning, intensely dynamic multiverse that has always existed and will always exist.


The Theory of Reflexivity holds that our universe is but one of an infinite number of universes in this multiverse. The flexibility of the central feature of this theory, that of optimized reflexive information wave dynamics, is applied in an attempt to explain all observed phenomena in our universe. It is proposed here that optimized reflexive information wave dynamics account for the creation of the universe through the Big Bang. The proposal accounts for, incorporates and is fully consistent with quantum mechanics and relativity. It attempts to explain the existence of life and consciousness through a new subordinate Theory of Bio-Consciousness. It attempts to explain self-awareness through a new subordinate Theory of Self-awareness, and it attempts to explain the existence of civilization through the new subordinate Theory of Civilization.


The Theory of Reflexivity posits that these phenomena are fractal in nature. They display self-similarity through their reflexive resonance at all scales and in all dimensions. Indeed the multiple informationally interlocked nested resonances of this tumultuous world can be described as creating the music of the spheres.


Reflexivity also has a dual nature based on bits and waves, which is similar to the dual nature, based on quanta and waves, that characterizes quantum mechanics. Is Reflexivity a general statement of quantum mechanics that applies to all scales and possibly to the multiverse? Is quantum mechanics merely the special application of Reflexivity at small scales?


Reflexivity is presented here as a scientific theory. This claim to scientific status is based on a number of points. First, the theory is the picture of simplicity. It is sweeping in scope. It is materialist and positivist. The theory is internally consistent. It can be expressed mathematically, and finally, it suggests avenues of experimentation for its own validation.


As outlined in the Preface, the Theory of Reflexivity not only provides a new way of understanding the larger past but also has predictive features that can be useful in speculating about the future. It allows the study of both the past and the future and the telling of the story of both within a consistent intellectual framework. In short it creates a new discipline, “omnistory”.


It is important to mention what the Theory of Reflexivity is not. It is not a “religious” or “spiritual” concept. The Theory of Reflexivity is silent with respect to the existence of God and the role that God may or may not have played in the origins or evolution of the multiverse. Reflexivity is utterly observational, materialist and positivist in nature. The theory rejects outright the existence of any ghost, soul or spirit as part of the explanatory process.


The internet was an essential tool in the writing of this work. The first draft was composed in Chelsea, Quebec during the summer of 2005 using internet research on a wide variety of topics including cosmology, cybernetics, quantum mechanics, evolution and consciousness. Chelsea is near Ottawa, a city which overflows with scientific documentation available at its two large universities and multiple government research agencies. In spite of this availability and access, the internet provided convenience and search power that greatly facilitated the production of the first draft.


A more refined draft, the one which forms the basis for this web site was written on Mahe Island in the Seychelles during January and February 2006. The Seychelles are a tropical Eden but they are poor in documentation of the type that is necessary for the writing of this sort of presentation. Once again the internet became an indispensable tool by providing access to the research material necessary to rewrite the original draft. Furthermore the internet has allowed the electronic publishing of this document from the Seychelles so that its contents may be shared with many others. The first inhabitants of Eden arriving in 1770 would have been amazed by this development.


Go With the Flow – But Be Ready for Surprises on the Way

(The Theory of Reflexivity – An Information (Thermodynamic) Philosophy of Nature)



Jeff Atkins

Chelsea, Quebec

Saturday 10 October 2009




Nature continuously optimizes itself by computing itself (reflexivity).


The universe, people and everything else are the result of a continuous, 13.7-billion-year-old universal computation that continues to this day.


The computation began by the differentiation of an original medium, a kind of blank slate, about 13.7 billion years ago. This blank slate was first differentiated by an initial phase transition and an infinite number of subsequent phase transitions have continued to differentiate the results ever since, creating the world we see around us.   


Information flow powers these phase transitions. The original medium was packed tight with information looking for somewhere to go and something to do (maximum enthalpy).  A little kick got it going and the first phase transition, the Big Bang, freed up much of this densely packed information which started a wavy flow, like water down a hill, powering all subsequent phase transitions as it went.


When all the information reaches the bottom of the hill and the flow stops, everything will collapse. 





The universe, people and everything else are the cumulative result of a 13.7-billion-year-old universal computation (the Langton Computation) that continues to this day. In much the same way that the computers we are familiar with calculate the problems we give to them, so too nature calculates its problem. What is that problem? Nature must find a way to optimize itself. But what is nature, you may ask? And whatever it is, how can it be optimized? Increasingly nature is being thought of as information, and the best way to optimize information is to process it.


If nature is to process its problem, it needs a computer. What is that? It is nature itself. Nature processes itself by treating itself as its own computer and using its own information to run this computer. This is reflexivity. And nature has its own special central processing unit (CPU), known as a phase transition. This is a natural physical process that is nature’s way of creating new things. A phase transition takes something that already exists and transforms it into something new through a natural computation (the Langton Computation). We see it every day when water freezes to form ice or when a caterpillar turns into a butterfly. We feel a phase transition when we wake up or fall asleep.  We can see the power of some phase transitions in the explosion of an atomic bomb.


Information Flow and Phase Transitions


Phase transitions are the creators of the natural world. Everything that we see around us has been created by an infinite number of phase transitions. Phase transitions do four important things. First, they liberate a deluge of vibrating information particles that flow like water down a hill. This free flow in turn establishes three metaphorical channels, which do three essential things simultaneously. The first channel is the creative channel; it provides the information to power and compute the process for the new phase (the Langton Computation). This uses up information (entropy) and can be of short duration. The second channel maintains the process that supports the continued existence of the new phase that has just been computed. It does this in two ways. First, it provides a continuous flow of new information to fuel or maintain the phase process. Second, it continuously flushes away the used information that comes from this maintenance process (entropy). Without this continued flushing or cleansing, entropy would accumulate like poison and quickly bring the phase process to an end. The new phase would die. The third channel continues to flow free, within a buffer, ready to power new phase transitions.


This is all well and good, but why do the vibrating information particles flow in the first place? It is the result of an event 13.7 billion years ago that continues to this day. The vibrating particles made up an original medium of pure information. They had (and have today) high potential, pressure, energy…choose the analogy you prefer. This blank slate of information particles was differentiated by the first phase transition and the information particles were released and began to flow toward the realm of low potential, pressure and energy, like water flowing down a hill. As the information flowed it powered other phase transitions which created the entire world that we see.


More About Phase Transitions – The Resonance-induced Absorption of Information and the Resistance-induced Radiation of Information by a Medium


Because this article is about information it would be a good idea to describe what I mean by the term. For present purposes I consider information to be a potential, three types of potential to be specific: the potential to be transformed, the potential to be the transforming agent, or the potential to have been transformed. Information can be any or all of these things. I should also mention reflexivity and why reflexivity is the means by which nature optimizes itself. The idea is simple. Reflexivity keeps processing distances short, minimizing waste (entropy) due to transmission loss during processing. (Any action has a cost known as entropy.)


Phase transitions begin with wavy information flow. Flowing wavy information imparts resonance to the surrounding medium and this induced resonance causes that medium to self-absorb some of the information from the flow (the way a TV picture is formed when a TV circuit resonates with the TV signal). The medium itself may be resistant to some of its information and may self-radiate information back into the flow. Any time something is done, there is a cost, called entropy. There is no free lunch. This resonance-induced self-absorption of information and resistance-induced self-radiation of information are basic features of information processing. The continuous flow of information creates a collision (a computation) between the self-absorption of information and the self-radiation of information, and this collision forms the dynamic that is the basis for the optimization of information processing that creates the universe.
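The TV analogy above is standard resonance physics and can be made concrete with a short numerical sketch. This is a textbook driven damped oscillator, offered purely as an illustration of resonance-induced absorption; nothing in it is specific to the theory, and the parameter values are arbitrary:

```python
def absorbed_power(drive_freq, natural_freq, damping, amplitude=1.0):
    """Average power absorbed by a damped harmonic oscillator driven at
    drive_freq: the standard Lorentzian resonance response."""
    w, w0, g = drive_freq, natural_freq, damping
    return (amplitude ** 2 * g * w ** 2) / ((w0 ** 2 - w ** 2) ** 2 + (g * w) ** 2)

# A medium "tuned" to the flow absorbs strongly; an off-resonance medium barely couples.
on_resonance = absorbed_power(drive_freq=1.0, natural_freq=1.0, damping=0.1)
off_resonance = absorbed_power(drive_freq=3.0, natural_freq=1.0, damping=0.1)
```

On resonance the absorbed power here is several hundred times larger than off resonance, which is the sense in which a resonating medium "self-absorbs" information from the flow while a non-resonant one lets it pass by.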


Causes of Phase Transitions


An information/medium has a kind of comfort zone and is able to absorb or radiate free information as long as its total information content remains in this zone. Phase transitions occur because the wavy free information absorbed or radiated by an information/medium rises above or falls below the upper or lower limits of this comfort zone. As mentioned above the information/medium has three parts, one fixed and two flow related. The fixed item is the buffer for free information. The two flow-related items are, first, the process/maintenance flow that characterizes the information/medium and, second, the flowing free information stored in the buffer.


What about the buffer? As mentioned it has a fixed storage capacity for the flowing free information, with maximum and minimum limits.


When the information/medium self-absorbs information through wave-induced resonance this free information is stored in the buffer. The characteristic process/maintenance is not affected. When an information/medium self-radiates information due to wave-induced resistance the free information that is radiated comes from the buffer. The characteristic process is not affected.


However when the information/medium absorbs more free information than the buffer can hold, the excess spills over into the maintenance/process. This causes the process to undergo the Langton Computation, which creates a new information/medium with a new maintenance/process that has a higher information content and a new empty buffer.


In the same way, when the information/medium radiates more information than is stored in the buffer, some is taken from the maintenance/process. This causes the process to undergo a Langton Computation which creates a new information/medium with a new maintenance/process that has a lower information content and a new full buffer.


The Langton Computation


So, what is a Langton Computation? A phase transition is nature's primary computer. The computation that it undertakes is known here as the Langton Computation, a name I have given it after Christopher Langton (Langton 1990). The Langton Computation calculates a number of things.
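For readers curious about the citation: Langton (1990) studied where computation can arise in cellular automata using a parameter λ, the fraction of rule-table entries that map a neighbourhood to a non-quiescent state, and found that complex, computation-capable behaviour clusters near a critical λ at the edge of a phase transition between ordered and chaotic dynamics. A minimal sketch of λ follows; the randomly generated rule table is my own illustration, not Langton's construction method:

```python
import random
from itertools import product

def make_rule_table(states, radius, seed=0):
    """Build a random rule table for a 1-D cellular automaton: every
    neighbourhood (a tuple of 2*radius + 1 cell states) maps to a next state."""
    rng = random.Random(seed)
    neighbourhoods = product(range(states), repeat=2 * radius + 1)
    return {n: rng.randrange(states) for n in neighbourhoods}

def langton_lambda(rule_table, quiescent=0):
    """Langton's lambda: the fraction of transitions to a non-quiescent state."""
    non_quiescent = sum(1 for s in rule_table.values() if s != quiescent)
    return non_quiescent / len(rule_table)

table = make_rule_table(states=2, radius=1)   # 2**3 = 8 neighbourhoods
lam = langton_lambda(table)                   # 0.0 = frozen order, near 1.0 = chaos
```

Tuning λ from 0 toward 1 moves an automaton from frozen order toward chaos; Langton's observation was that the interesting, computation-like regimes sit near the transition between the two.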


First it computes the properties or process for the new medium. It can do this by creating a whole new process that is characteristic of a whole new medium. Or it can filter out some features of the old process. Or it can do both and combine the results. When features are filtered out, the process is called symmetry breaking. Where the resulting medium has some properties in common with the originating medium, we say that the symmetry of those properties has been maintained.


The Langton Computation also frees the wavy flow of information to continuously support the continued existence of this new medium (remember, without waves the medium neither absorbs nor radiates information). It also calculates the parameters for the new buffer. These include the lower and upper limits of the buffer's storage capacity for free information. They also include the type of information that the buffer will hold, including the degrees of freedom for the free information.


Finally the Langton Computation calculates the amount of wavy free information that will be stored in the buffer and allows that information to flow into the buffer.


The Langton Computation is the most wonderful process in nature. It is the basis for the creation of everything that we see around us, including ourselves.




Phase transitions are not deterministic. Stated another way, the Langton Computation is not deterministic. It is incredibly mysterious. For example, it is impossible to predict, by examining the medium itself, whether it will undergo a phase transition and the associated Langton Computation in a changing environment. If it does undergo a phase transition, it is impossible to predict, by examining the medium itself or the environment, the conditions that will induce the phase transition. Furthermore, it is impossible to predict, by examining the medium itself or the environmental conditions that induced the phase transition, the nature of the medium that will be created. Worse than that, once the phase transition has occurred it is impossible to explain, after the fact, why the new medium is characterized by its unique new properties. The mystery of the phase transition is total. These things can only be known through observation and then noted.


This has important consequences for our understanding of the natural world. For example, it is impossible to predict that solid water will become liquid under certain circumstances (assuming of course that this phenomenon has never been observed). All we can do is observe the transition from solid to liquid and note the existence of the new phase. I argue here that such things as space, time, energy, nucleons, atoms, stars, galaxies, bio-consciousness, self-awareness and civilization are created by phase transitions. If this is the case, then the creation of these new phases by the Langton Computation can never be explained, even now, after the fact. The emergence of these new phenomena can only be observed and noted.


The Langton Computation is the most mysterious phenomenon in nature. In many ways it is nature's way of covering its tracks, of hiding how it creates new things. It imposes a limit on our knowledge of the natural world. But if there are limits to what we can know, at least we know why we cannot know.


The First Phase Transition – The Big Bang


Where and when did all these phase transitions start? They started with an original medium. This medium, a kind of blank slate, was differentiated by the first phase transition, which we call the Big Bang. The Big Bang initiated a huge flow of free information that formed the three metaphorical channels, mentioned above. The first channel provided the power for the Langton Computation that created what we see as space and time. The creation of space and time continues to this day, a fact that has implications that are discussed later. The second channel was formed to continuously maintain the existence of space and time and the third channel of free information was formed to power new phase transitions.


Time-Out for Space Time


There are a number of ideas relating to the nature of space-time. The most familiar is Einstein's Theory of General Relativity. But there are others.


One comes from String Theory and proposes that space-time has ten dimensions. In our universe we only observe four of these ten, three of space and one of time. According to String Theory the other six are curled up into tiny knots that have yet to be observed.


Another, “Twistor Theory”, was devised by Roger Penrose in 1967. According to this view, space is not the unified collection of all points in three dimensions that we intuit, but rather is based on a collection of rays of light which represent a network of causal relations among events in space-time. The complete theory has yet to be written.


Then there is “Noncommutative Geometry”, developed by Alain Connes in 1988 and based on the idea that one region of space-time can have an influence on other regions.


“Causal Dynamical Triangulation”, a real mouthful, was developed jointly in 2002 by Jan Ambjorn and Renate Loll. Based on Einstein's Theory of General Relativity, it describes space-time as being composed of minuscule tetrahedral “pixels”.


Finally, “Loop Quantum Gravity”, elaborated in 1990 by Lee Smolin, Abhay Ashtekar and Carlo Rovelli as a result of dissatisfaction with String Theory, offers the view that space is quantized and composed of tiny loops 10⁻³⁴ metres in diameter.


The Matter of Anti-matter


The flow of free information released by the Big Bang powered a second phase transition, computing quantum fields with associated particles of matter in the form of fermions (quarks and leptons) and energy in the form of bosons. It is also possible that particles of what is referred to as dark matter were formed. It appears that, unlike the creation of time and space, the creation of matter and energy was a one-time event and that the total of both remains constant to this day. The second channel of information began to flow and perform its continuous two-part task. First it provided the information to continuously maintain matter and energy. The flow also simultaneously and continuously flushed away the entropy produced by their creation. The third channel was also established, with free-flowing information ready to create new things through phase transitions.


The universe today is dominated by matter. However very early in the new universe this was not the case, at least according to the Standard Model of cosmology. In this model the early universe consisted of both matter and anti-matter. However there was slightly more of the former than the latter. They are mirror images of each other with one exception…charge. For example, an electron has a negative charge while an anti-electron, also known as a positron, has a positive charge. When a matter particle collides with an antimatter particle they annihilate each other and release energy. In the fractions of a second following the Big Bang, two phase transitions resulted in the elimination of all anti-matter in the universe and created the matter-dominated universe that we see today. The first, at 10⁻⁴ seconds after the Big Bang, consisted of a quark/anti-quark encounter which resulted in the annihilation of all anti-quarks. Because quarks were slightly greater in number than anti-quarks, a few remained after the phase transition. They went on to form the nucleons (neutrons and protons) mentioned later. But it did not end here. At one second after the Big Bang a second phase transition, another annihilation event, occurred. This time electrons and anti-electrons (positrons) reacted, with all anti-electrons being annihilated. Because electrons were slightly greater in number, a few remained after this second phase transition. They went on to join the nucleons and form atoms.


Matter is Created Once – Space is Created Continuously   


The Big Bang phase transition that created space and time is very special because it is continuous (the Langton Computation is continuous). In recent years the expansion of space has been observed to be accelerating, and the cause has been ascribed to an unknown process powered by an unknown force called “dark energy”. It is interesting to speculate on the idea that this acceleration of the expansion of space may be due to the release of free information from the original medium, the blank slate, which gives a boost, so to speak, to the creation process of space. Should this be the case, our universe would not be the closed system that many think it is. No, it would be open, with free information flowing “in” from some “outside” source. Such a state of affairs would have astonishing implications for such things as the laws of conservation and communications.


Furthermore, the continuous expansion of space may end up diluting the information density of space. As more and more space is created, the free information that is liberated may find the increasing volume of space so empty that there is nothing to create. Remember: the creation of space and time is continuous, but the creation of matter and energy was a one-time event. Once it was created, that was it. No more. As space continues to expand, this matter and energy is diluted. As it becomes increasingly diluted, the new information released by the continuous creation of space-time cannot be absorbed by its surroundings to induce a new phase transition, because the new surroundings are essentially empty of matter and energy. The flow of information becomes too diffuse to induce any Prigogine Phase Transitions (see below). This new information, although it continues to flow, flows down a bottomless well. Nothing new happens; space continues to expand, becoming progressively emptier…information death. How dull.


Interlude – The Blank Slate – Maybe Not So Blank


While we talk of the Big Bang as a singular event I present the idea that it is the local example of an infinite number of such events in a class of phase transitions that I call the Big Bang class. These have created and will continue to create multiple universes, which form the multiverse, by continuously differentiating the original undifferentiated medium. Our universe is the result of one such event. This idea goes by the name of “Eternal Inflation”.   


But it may be more complicated than that. There are other views of what may have preceded the Big Bang and what nature might have looked like before it.


A second view is based on the concept of quantum mechanics. This “Parallel World” view was proposed by the American physicist Hugh Everett in 1957 and invokes the superposition of quantum states. According to this view all possible worlds exist at the same time and in the same place! The world we see is based on “experience”. It is the “experience” of the observer that creates the world. Perhaps this “experience” invokes a phase transition that causes a particular world, one among an infinite number, to make itself available for observation.


A third view is based on concepts from the theory of quantum gravity. According to this idea, as matter approaches the centre of a black hole, gravity becomes repulsive and pushes the matter away from the centre. Matter bounces back. This creates a new universe inside the black hole itself. As a result each black hole becomes the home of a new universe. Perhaps this rebound is the result of a phase transition.  


The final idea is more prosaic. It is based on General Relativity and the idea that the geometry of our universe is flat and infinite. We are limited in our observation of the universe by the speed of light, which is finite, and we can therefore see only a small part of the infinite whole. An infinite number of universes exist beyond this “light horizon”. The diameter of each would be centred on the point of observation and would expand with the time spent observing from that point. However all universes would have the same physics as ours. No phase transitions would be involved in this instance, merely a change of the location of observation.


Prigogine Phase Transitions: Self-dissipation – Optimizing the Small and Large Scales


As the density of free information decreases and entropy increases in the ever-expanding universe, it would seem that the prospects for the creation of structures of increasing order are not bright. However a quick look around us demonstrates that order and structure are commonplace. They are everywhere! How can this happen if the universe appears to be running out of gas? Well, the flow of free information is the key. Up to this point (the present day) in the history of the universe, neither the decreasing density of matter, energy and free information nor increasing entropy has been of any consequence as relates to the creation of order. They have been overcome by the continuous flow of free information. This flow flushes away the entropic poison which is a by-product of the phase transition. This fortunate turn of events has been explained by the Russian-born physical chemist Ilya Prigogine and is the hallmark of what he calls a dissipative system. This fortunate process is at the heart of a new class of phase transition, the Prigogine Phase Transition (a name I have given it, after Prigogine), that emerged shortly after the free information released by the Big Bang began to flow.


The first Prigogine Phase Transitions occurred shortly after the Big Bang Phase Transition. Remember, that transition created the information flow in the form of the fields associated with fermions and bosons, including gravity, that we see as matter and energy. In the case of matter, these Prigogine transitions took the smaller components (fermions, or more specifically quarks) and combined them, creating the sub-atomic particles…neutrons and protons. The process was calculated by a Langton Computation.


The continuing flow of information released by the Big Bang provided the information for the maintenance of the structure of these protons and neutrons. It also continuously flushed away the entropy associated with the buzzing quantum field of quarks that make up both particles. Following the creation of sub-atomic particles, further Prigogine Transitions created neutral hydrogen atoms. This transition, which occurred about 400,000 years after the Big Bang, is known as “recombination” and it made the universe transparent to the passage of photons for the first time. Because no stars had yet formed to light it, the period that followed is known in cosmology as the “Dark Ages”. A further Prigogine Phase Transition known as “reionization” occurred between about 150 million and one billion years after the Big Bang, when the radiation of the first stars once again ionized the hydrogen.


As space expanded these hydrogen atoms came under the influence of gravity, itself created by a phase transition as mentioned above. This induced a new series of Prigogine Phase Transitions, powered by the flow of information (gravity), that formed the structures of the cosmos. The differentiations of stellar types (and associated planets) that mark the passage of a star through the phases of the Hertzsprung-Russell diagram are each the result of Prigogine Phase Transitions. This is seen in the formation of a protostar from a stellar nebula and its progression through various stages of the Hertzsprung-Russell main sequence to become a red giant, and variously a planetary nebula, a white dwarf or a black dwarf. So too do Prigogine Phase Transitions create the larger-scale structures of the universe, including galaxies, clusters, super-clusters and larger phenomena. This has occurred, and continues to occur, through the dissipative “decoupling” of normal (baryonic) matter, which can radiate away energy, from dark matter, which cannot, a decoupling that characterizes the Prigogine Phase Transition.


This growth and differentiation of stars by Prigogine Phase Transitions also resulted in differentiated atomic structure, through recursive rounds of phase transitions seen as nuclear fusion in the centres of different star types. This process created all the elements of the periodic table. These new elements were often scattered throughout space in star explosions, or supernovae, themselves the colossally energetic result of phase transitions, in the form of the spectacular failure of a Prigogine Phase. This is because a hallmark of a Prigogine Phase is its ability to dissipate entropy, thus ensuring stability. In the case of an exploding star, a sudden increase in entropy overwhelms the dissipative process. Boom!


WKJG (Wachtershauser, Kauffman, Jordan, Ghin) Phase Transitions – Auto-catalysis – Optimizing the Middle Scale


This is a mouthful. What do these do? They create bio-consciousness from a non-bio-conscious environment.


The “Bio” of Bio-consciousness


As we have seen, phase transitions created the quantum world at the small scale and the cosmos at the large scale. The creation of planetary bodies provided a place for new phase transitions to occur. Nature, always seeking a way to optimize its processing, found fertile new ground, creating in effect the middle scale. The surface of at least one planet, this one, was a sea of zillions of atoms and molecules at a high energy state (Morowitz, Kauffman). More particularly, their associated electrons were at a high energy state. All these electrons were looking for somewhere to go and something to do. Optimization is always the name of the game. Chemical reactions (themselves Prigogine Phase Transitions) can be a helpful solution to the problem of moving electrons from a high energy state to a low one. But normal chemical reactions are relatively slow. Although such reactions occurred in this new environment, they did not relieve the tension. There remained an overwhelming crowd of electrons still looking for a way out, a fast lane to a lower energy state. Life provided a solution to the problem. Riding to the rescue come Wachtershauser, Kauffman, Jordan and Ghin.


A closer look at the situation reveals this. Elements can combine to form a bewildering number of different types of molecules. Furthermore, a number of molecules act as catalysts for the reactions whereby they themselves are created. In other words, some molecules are self-catalytic or auto-catalytic. At the point where the ratio of the number of possible reactions between molecules to the number of possible different types of molecules becomes greater than one to two (one reaction for every two molecule types), an autocatalytic network emerges (Kauffman). Voila, the WKJG Phase Transition.
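Kauffman's threshold can be illustrated with a toy random-graph sketch (this is the percolation idea behind his argument, not his actual chemistry): treat molecule types as nodes and possible catalyzed reactions as randomly placed edges. Below a reaction-to-molecule ratio of about one half, connected clusters stay tiny; above it, one giant connected network suddenly spans the system.

```python
import random

def largest_component_fraction(n_nodes, edge_ratio, seed=0):
    """Build a random graph with n_nodes and edge_ratio * n_nodes random
    edges, then return the largest connected component's size as a
    fraction of all nodes (simple union-find)."""
    rng = random.Random(seed)
    parent = list(range(n_nodes))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    for _ in range(int(edge_ratio * n_nodes)):
        a, b = rng.randrange(n_nodes), rng.randrange(n_nodes)
        parent[find(a)] = find(b)

    sizes = {}
    for node in range(n_nodes):
        root = find(node)
        sizes[root] = sizes.get(root, 0) + 1
    return max(sizes.values()) / n_nodes

# Below the ~0.5 edge-to-node threshold, clusters stay tiny; above it,
# a single network abruptly spans much of the system.
sparse = largest_component_fraction(5000, 0.2)  # well below threshold
dense  = largest_component_fraction(5000, 1.0)  # well above threshold
```

With 5,000 "molecules" the sparse graph's largest cluster holds only a tiny fraction of the nodes, while at a ratio of 1.0 a single component typically spans most of them. That abruptness is the "Voila" of the WKJG Phase Transition.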


In other words, chemical networks that become self-sustaining emerge at the middle scale. Indeed it is the emergence of these networks in this spatial realm that defines the middle scale. These networks had a huge variety of properties. Many were characterized by channels (electron transport chains) that expedited the flow of electrons, with varying degrees of efficiency. Those that most effectively allowed the flow of electrons from a high energy state to a lower state came to dominate among the large number of network types. The most effective were those that catalyzed chemical reactions! We call these networks by their biological name…metabolism.


What is so good about metabolism? Why is it the best solution to the dissipation of electron energy, for the passage of an electron from a high energy state to a low energy state? Or, stated another way, why is metabolism the best solution for the flow of information (electrons are a type of information) from a relatively higher potential to a relatively lower potential? There are a number of information processing features that favour metabolism. The first is that metabolism endures. It outlasts the one-time nature of a simple inorganic reaction, for example. This provides a continuity of flow. The catalytic nature of the process, as well as the very short transmission distances involved, provides speed. Thus the express lane! The complexity of the process provides more bandwidth than simpler reactions. This combination of continuous information flow, speed and bandwidth (complexity) allows a relatively more massive flow of electrons (information), via electron transport chains, than simpler chemical reactions. This speed and flow of rich information ignited the phase transition, the Langton Computation that created bio-consciousness. It also provided the continuous flow needed to maintain the new phase, the metabolism. Furthermore, the information flow flushed away the huge amount of waste information produced by the calculation of the process for the new phase (the metabolism), and produced as well by its continued maintenance.


A quick word about the dominance of carbon in biology. Metabolism is characterized by two coupled processes: catabolism, the power source for building, and anabolism, the building process itself. Carbon is an ideal building block for the anabolic process because while its outer electron shell has the ability to hold eight electrons, under normal circumstances it holds only four. Because chemical reactions involve the mutual sharing of electrons by atoms, carbon has room for four more electrons from other atoms, more than its competitors, hydrogen, helium, lithium, etc., before its outer electron shell is complete. Through the process of the natural selection of a variety of auto-catalytic anabolic structures, nature quickly favoured carbon-based anabolisms, as seen in the early presence of the Calvin Cycle, the reductive TCA Cycle, the 3-hydroxypropionate cycle and the acetyl-CoA pathway. This is because they favoured the efficient production of enzymes (proteins) that made their associated catabolic electron transport chains more effective than competing chains. Because they were more efficient in harnessing the power of their high energy electron environment, they grew faster and more numerous and came to dominate the competition.
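The electron-counting argument above can be restated in toy form, with one sharpening: each covalent bond requires both a valence electron the atom can share and an empty slot in its outer shell. The valence counts and shell capacities below are standard textbook values; the one-line "bonds" formula is simply that argument made explicit.

```python
# Valence electrons and outer-shell capacity for the first few elements.
elements = {
    #      (valence electrons, outer-shell capacity)
    "H":  (1, 2),
    "He": (2, 2),
    "Li": (1, 8),
    "Be": (2, 8),
    "B":  (3, 8),
    "C":  (4, 8),
    "N":  (5, 8),
    "O":  (6, 8),
}

# A covalent bond needs one electron from the atom itself AND one empty
# slot in its shell, so bonding capacity = min(electrons, room left).
bonds = {
    symbol: min(valence, capacity - valence)
    for symbol, (valence, capacity) in elements.items()
}
# Carbon's half-filled shell maximizes this:
# H=1, He=0, Li=1, Be=2, B=3, C=4, N=3, O=2
```

Carbon's four bonds, the maximum in this table, are what make it the favoured framework for anabolic building.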


Auto-catalytic processing can vary in intensity as it is influenced by, among other things, the nature of the information environment. This variation in intensity results, through a process of recursive auto-catalytic formation and natural selection (Kauffman), in what we call speciation. As a consequence, we see all the different types of bio-conscious forms that have characterized the history of life, ranging from anaerobic prokaryotic bacteria to large aerobic eukaryotes.


Getting a Grip on Chirality


Handedness, or chirality, is an important feature of chemistry. While many molecules are superimposable on their mirror images, others can form in two versions, each the mirror image of the other. One form is referred to as the “L” or left-handed version and the other as the “D” or right-handed version. Usually, in reactions where molecules that display handedness are formed, equal numbers of molecules of both hands, “L” and “D”, are produced. Where there are equal numbers of both, the mix is said to be racemic and to display heterochirality.


However, bio-organic molecules, many of which are chiral, are often not found in mixtures with equal numbers of each hand, but display a preference for one hand or the other. Such molecules are said to be homochiral and are produced in groups that are either exclusively “L” or exclusively “D” handed. The two mirror-image forms of such a molecule are known as enantiomers. In particular, life today is built from L-amino acids and D-sugars. The origin of this chirality remains one of the great mysteries of biochemistry.
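The degree of handedness in a mixture is conventionally measured as enantiomeric excess. The formula is standard; the sample counts below are purely illustrative:

```python
def enantiomeric_excess(n_l, n_d):
    """Standard measure of handedness: 0.0 for a racemic (50/50,
    heterochiral) mixture, 1.0 for a homochiral (single-handed) one."""
    return abs(n_l - n_d) / (n_l + n_d)

racemic    = enantiomeric_excess(500, 500)  # equal hands -> 0.0
homochiral = enantiomeric_excess(1000, 0)   # all one hand -> 1.0
```

The mystery of biological chirality is, in these terms, why life's amino acids and sugars sit at an excess near 1.0 when ordinary chemistry delivers mixtures near 0.0.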


However, there is some speculation concerning the resolution of this difficult problem. Martin and Russell (Martin, Russell 2002) offer explanations for the formation of each of the molecules mentioned above.

In the case of the L-amino acids, they state that a precursor enzyme (of the peptidyl transferase reaction) may act as a filter to produce only “L” handed amino acids. I speculate that this enzyme performs a phase transition, the Langton Computation of which always produces an L-amino acid.


They offer a similar argument for the D-sugars. They suggest that in the acetyl-CoA anabolic pathway, the enzyme enolase (a protein), or a precursor, catalyses the formation of D-sugars. Again I speculate that this enzyme performs a phase transition with a Langton Computation that always produces a D-sugar.


The “Consciousness” of Bio-consciousness


However, there is more to auto-catalytic processing (metabolism) than biology. Jordan and Ghin state that the auto-catalytic process is also characterized by phenomenology, what we call consciousness.


I believe that the WKJG Phase Transition created a new phase that is both biological and conscious, material and phenomenological…in other words, bio-consciousness. They argue that it is impossible to have metabolism without consciousness and that it is impossible to have consciousness without metabolism. They argue that this is so because the metabolism embodies the context of the environment from which it was created. And this creation was the result of a WKJG Phase Transition. What is it about this phase transition that embodies the environment and creates a phase that is both material and phenomenological? I speculate that it is the auto-catalytic process. The process locks in the characteristics of the environment and internalizes them. This stands in contrast to the Prigogine Phase Transition, which pulls in information from the environment but passes it through the system without any auto-catalytic processing. However, as stated above, because of the mystery associated with the Langton Computation it may well be impossible to know how the abiotic becomes bio-conscious.


Assuming, however, that metabolism is indeed simultaneously material and conscious, what can we say about the consciousness of the first bio-conscious creatures? It is reasonable to speculate that it was very simple. It would also be reasonable to argue that the consciousness of the first anaerobic bacteria, for example, was to human consciousness as the metabolism of those bacteria was to human metabolism…in other words, very simple compared to stunningly complex.


Is it possible to be more specific? Maybe. Perhaps bacterial consciousness can be described in terms of quality and quantity. First, quality. In saying that the consciousness of a given metabolism (that of the first bacteria) was simple, it is reasonable to argue that it was undifferentiated. For example, it made no distinction between an inner and an outer world. It merely felt what it was like to be a bacterium, to use the language of present day researchers into consciousness. It experienced a very simple “something”. It is possible to speculate that this experience, feeling or sensation of what it is like to be a bacterium, although undifferentiated for a particular bacterium, was the same for all the bacteria of the same family, all of those with the same metabolism. Furthermore, it is possible to speculate that the consciousness of a bacterium with one metabolism was (is) qualitatively different from that of bacteria with other metabolisms.


Is it possible to say more about the quality of bacterial consciousness? Again, maybe! As stated above, the first metabolisms, through auto-catalysis, embodied the environment, both the specific environment from which they emerged and the larger environmental context that supported the existence of that specific environment (Jordan, Ghin). What might these original environments have been like? In describing them we can appreciate the physical nature of the metabolism as a first step in understanding the quality of bacterial consciousness.

Wachtershauser speaks of a number of possibilities for such early environments that may have given rise to precursor metabolisms, each consisting of an energy generation process, catabolism, powering a building process, anabolism. He speculates that electron energy from electron transport chains in sulfide-based, iron-sulfur-based and nickel-based catabolisms may have powered anabolic building, with atoms of carbon from environmental CO, through a process described as the acetyl-CoA pathway.


And the quantity of bacterial consciousness? Is there a way to appreciate that? Perhaps, if we look at the power source for anabolism we may gain some insight with respect to the “intensity” of bacterial consciousness.


There are a number of possibilities, ranging through varying degrees of electrical potential (Gibbs Free Energy): from electrons provided by both inorganic and organic molecules as they passed through a variety of catabolic electron transport chains, to photons, which powered the first anaerobic photosynthesis by means of the electron cycle of the associated Photosystem One. The relative power differentials of these energy sources may provide some appreciation of the quantity of bacterial consciousness.
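These power differentials can be made concrete with the standard relation between redox potential difference and Gibbs free energy, ΔG = −nFΔE. The midpoint potentials below are approximate textbook values at pH 7; the donor/acceptor pairings are illustrative, not a claim about which chains the first bacteria actually used:

```python
FARADAY = 96485.0  # coulombs per mole of electrons

def gibbs_free_energy(n_electrons, e_donor, e_acceptor):
    """Free energy change (J/mol; negative = energy released) when
    n electrons fall from a donor at potential e_donor to an acceptor
    at e_acceptor (both in volts)."""
    return -n_electrons * FARADAY * (e_acceptor - e_donor)

# Approximate standard midpoint potentials (volts, pH 7):
E_H2   = -0.41   # H2 / 2H+
E_NADH = -0.32   # NADH / NAD+
E_SO4  = -0.22   # sulfate / sulfide (an anaerobic acceptor)
E_O2   = +0.82   # O2 / H2O (the aerobic acceptor)

# A modest anaerobic drop versus the much steeper aerobic one:
anaerobic_drop = gibbs_free_energy(2, E_H2, E_SO4)   # ~ -37 kJ/mol
aerobic_drop   = gibbs_free_energy(2, E_NADH, E_O2)  # ~ -220 kJ/mol
```

If the "intensity" of bacterial consciousness tracks the steepness of the electron drop powering its metabolism, these numbers suggest roughly how different energy environments might rank.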


It may be that successive recursive cycles built increasingly complex metabolic networks, differentiating both the biological and phenomenological parts of bio-consciousness. All this was done with electron flow (information flow) and building materials (hung mostly on a framework of carbon), both from the environment. Increasing amounts of waste chemicals and heat (entropy) were flushed away, allowing the growing structure to endure.


Bio-consciousness is a wonderful solution to the problem of optimization. It optimizes processing. It does this by creating, through a phase transition from a purely material world, a new phase which is simultaneously both material and phenomenological. The medium embodies, both materially and phenomenologically, the environment from which it emerged. WOW!! But there is even more. Bio-consciousness fills an empty processing niche. Processing first created space. Space expanded and became characterized by scale. Optimization took advantage of the scale it created and began processing at short range (the Planck distance), creating and defining the small scale, which we see as the quantum world. As space expanded, long range gravity-based information processing began to occur, creating and defining what we see as the astronomical world. This new domain optimized itself once again, with the emergence of bio-consciousness creating and defining what we call the middle scale.


The concept of consciousness as a contextually emergent property of self-sustaining systems raises a number of important questions. For example, what does it mean for the development of artificial consciousness using the approach of chips and programming? The implications seem to be bleak. According to Jordan and Ghin, in order for consciousness to be part of an information processing system, the system must first be self-creating, in order to embody the context of the environment, and then it must be self-sustaining, or auto-catalytic, to “lock in” the properties of that environment. The technology for the creation of a context that allows the self-creation and self-sustenance of a computer is not available at present and will probably not be so far into the future. However, using another approach that involves the ideas of Wachtershauser and Kauffman, and more specifically the concept of auto-catalytic sets (and by implication the ideas of Jordan and Ghin), scientists are exploring the experimental creation of bio-consciousness (life) in the laboratory. The chances for success seem much more promising.


What about the idea of the living universe, the living planet, the Gaia? As we have seen, the essence of the WKJG phase is the concept of bio-consciousness as the contextually emergent property of auto-catalytic systems. If it is conscious, it is alive. If it is alive, it is conscious. You cannot have one without the other. Well, it does not look like the universe or the planet is bio-conscious. These processes and structures are certainly self-regulating, self-organizing and self-dissipative, but they are not auto-catalytic. Furthermore, they are characterized by processing distances that are too great, bandwidths too narrow and topologies too simple to allow the highly coloured information flow (high speed, high volume and high complexity) essential to create and sustain bio-consciousness. Bye-bye Gaia!


Metzinger Phase Transitions – The Ultimate Middle Scale Optimization


Biology and consciousness have been differentiated through recursive rounds of WKJG Phase Transitions and natural selection, a process we see expressed through speciation. The biology and the consciousness of different species have become larger, more energetic and more complex. With the progression of time, the differentiation of consciousness has manifested an increasingly wondrous number of features…the ability to “feel” disruption, alarm and emotion; the ability to distinguish between the inside and outside worlds; the ability to make internal representations of the inside and outside worlds; the ability to remember; the ability to integrate a variety of processes and make decisions; and the ability to know (cognition), among other things.


The drive to optimize, powered by electron (information) flow, created further differentiation. It created self-awareness. Self-awareness involves a differentiation of cognition. It is possible to be conscious without knowing that the consciousness being experienced is one's own. However, the emergence of self-awareness through a Metzinger Phase Transition introduced the ability to know that the consciousness we experience is our own.


We intuit from their behavior that insects are highly conscious creatures but that they are not self-aware. They do not know that the consciousness they experience is their own. Humans, however, do know that their consciousness is their own. This involves the differentiation of cognition so that humans “know” that the feelings, sensations and other aspects of consciousness are “their” consciousness.


According to Metzinger, this is the result of what he calls the illusion of self-awareness. He states that this is due to a number of brain processes. In his view the brain creates a highly dynamic body space, one's own space, that is like an empty shell, a hollow person, the surface of which coincides with the physical surface of our bodies and which moves when we move. He states that other parts of our brain create, and consolidate into a seamless whole, the “experience” we call consciousness. Finally, yet other parts of the brain cause the feeling to “fill”, or coincide or correspond to, the hollow body space. This filling of the body space with consciousness creates what he calls the illusion of self-awareness. I speculate here that these processes are the results of what I call Metzinger Phase Transitions (MPTs). One MPT creates the body space through the differentiation of a simpler feeling of body. Another MPT consolidates all the elements of our consciousness into a seamless whole. Finally, a further MPT associates this seamless experience with the body space.


Should this be the case it would be the logical outcome of nature’s drive to optimize. It helps solve the local problem of an environment overloaded with high energy electrons looking for somewhere to go and something to do, by maximizing their passage through electron transport chains supported by auto-catalytic bio-consciousness. But more importantly it creates a new medium, that which we call self-awareness, that allows the universe not only to see and feel itself but also to know that this sensation is its own. It is nature’s drive towards optimization through reflexive information processing come full circle. The universe has created a way to experience itself.  


But it does not stop here. Large numbers of bio-conscious creatures (people) interact and create new auto-catalytic information processing networks which, through recursive phase-transition-based growth and natural selection, differentiate and create what we call civilization.


The End 


We have seen that as things are created some of the wavy information flow is used up (entropy) in the creation process (the phase transition). Some is also used to continually maintain the things that have been created (more entropy). But some remains unused and continues its wavy flow, (enthalpy), powering new phase transitions which continue to create new things. When all the information has flowed to the bottom of the hill (maximum entropy) everything will come to an end.


In their book “The Five Ages of the Universe”, Adams and Laughlin describe this process in marvelous detail. The universe begins by building itself up, with two successive Positive Prigogine Phase Transitions, which create first the hot phase and then the stelliferous phase, the phase in which we presently find ourselves. The universe then begins to fall apart, and does so in three successive Negative Prigogine Phase Transitions. The first creates the degenerate phase. This is followed by the black hole phase and finally the dark phase.


The first two phases are the result of Positive Prigogine Phase Transitions, which successfully create and maintain structure and process by the constant dissipation of entropy. However, the last three phases are the result of Failing Prigogine Phase Transitions, where structure and process slowly degenerate due to the progressive inability of the Prigogine Phase to dissipate an ever increasing flux of entropy. Where does this increasing entropy come from?


Remember that space is expanding and information density is decreasing. A point is reached where the information throughput in the Prigogine Phase is no longer sufficient either to maintain the phase or to flush away entropy. The entropy accumulates and the stelliferous phase begins to collapse, in three progressive phase transitions, losing enthalpic information all the way. The first transitional collapse is from the stelliferous phase to the degenerate phase. The second is from the degenerate phase to the black hole phase. The third and final phase transition is from the black hole phase to the dark phase…the end!


An Appreciation of the Theory of Reflexivity


It is possible to make a few simple comments about the Theory of Reflexivity.


-          The theory is simple. It states that nature optimizes itself through the continuous reflexive self-differentiation of an original undifferentiated medium by means of multiple self-organizing phase transitions powered by information flow.


-          The theory is sweeping in scope. It attempts to provide a coherent explanation for everything that has ever existed and everything that will ever exist.


-          The theory is internally consistent and can probably be expressed mathematically. However, the discontinuities related to phase transitions will be tricky to model. The mathematics of Langton, Prigogine, Kauffman and Boltzmann, among others, may be helpful. It should be remembered that being internally consistent does not mean that something is right. The story of Goldilocks and the Three Bears is internally consistent, but of course it is a fairy tale.


-          The theory makes predictions. For example, one prediction is that bio-consciousness has been created zillions of times from a non-living environment, beginning about 3.7 billion years ago, and that this creation process continues right up to the present day. Life is being created all around us all the time, right now, because the same environment, that of high energy electrons looking for something to do, that existed billions of years ago still exists today. Furthermore, this life creating process will continue until that environment no longer exists.


-          The predictions made by the theory can be tested. Using the example mentioned above, we can be inspired to look for examples of metabolism appearing abiotically in the world around us.
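As a pointer to what the mathematics of Langton and Boltzmann mentioned in the list above might look like, two of the simplest candidate raw materials are Boltzmann's entropy formula and Langton's lambda parameter for cellular-automaton rule tables. These are the standard definitions only, not a mathematical model of the theory itself:

```latex
% Boltzmann: the entropy of a macrostate with W compatible microstates
S = k_B \ln W

% Langton (1990): for a cellular automaton with K states, neighbourhood
% size N, and n_q rule-table entries mapping to the quiescent state,
% "computation at the edge of chaos" appears near a critical value of
\lambda = \frac{K^N - n_q}{K^N}
```

Any serious mathematical treatment of the theory would have to connect quantities like these across the discontinuity of a phase transition, which is exactly the tricky part the list above acknowledges.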


Other comments about Reflexivity can also be made.


-          The Theory of Reflexivity destroys the concept of cause and effect. This is because the output of reflexive processing can modify the transforming process that created it in the first place, in ways that cannot be predicted (the Langton Computation).


-          The effects of the Theory of Reflexivity are manifested in such phenomena as turbulence, complexity and chaos that abound in nature. 


-          The effect of the Theory of Reflexivity on the limits to knowledge is expressed mathematically in Gödel’s Theorem.


-          And finally, that which explains everything explains nothing. Because the definition of a phase transition is not precise, the concept itself may be useless. It may be possible to cherry-pick events in a way that supports the idea presented here when they are not phase transitions at all. It may be that the facts are conveniently (but unintentionally) twisted to fit the theory. Should this be the case, the argument expressed in this article is based on a house of cards... I tried.


The infinite number of phase transitions that have created the world to this point, and that support its very existence at this very moment, coupled with the reflexive uncertainty of each one of these transitions, create a reality that is a type of wonderful illusion. The result is that the world around us, and indeed we ourselves, are like a cloud, a ghost.


Finally, as mentioned above, the deeper implications relating to limits on our ability to understand nature may be somewhat unsettling. It may be that no matter how hard we try, we can never know everything. We will never have a full knowledge of the laws of the natural world. That is the bad news. The good news is that although we can never know everything, we at least know why we cannot know!


For Further Investigation



Kauffman, Stuart, At Home in the Universe: The Search for the Laws of Self-Organization and Complexity, Oxford University Press, Oxford, 1995.


Metzinger, Thomas, The Ego Tunnel: The Science of the Mind and the Myth of the Self, Basic Books, New York, 2009.




Klein, Martin J., The Development of Boltzmann's Statistical Ideas, in E. G. D. Cohen and W. Thirring (eds.), The Boltzmann Equation: Theory and Applications, Acta Physica Austriaca Suppl. 10, Springer, Wien, 1973, pp. 53-106.


Floridi, Luciano, Introduction: What is the Philosophy of Information, 2002.


Jordan, J. Scott; Ghin, Marcello, (Proto-) Consciousness as a Contextually Emergent Property of Self-Sustaining Systems, Mind & Matter, Vol. 4(1), pp. 45-68.


Langton, Chris, Computation at the Edge of Chaos: Phase Transitions and Emergent Computation, Physica D, 42, North-Holland, 1990.


Liko, Thomas; Wesson, Paul S., The Big Bang as a Phase Transition, Department of Physics, University of Waterloo, Ontario, arXiv:gr-qc/0310067v2, 13 Jan 2005.


Martin, William; Baross, John; Kelley, Deborah; Russell, Michael J., Hydrothermal vents and the origin of life, Nature Reviews, Volume 6, November 2008, pp. 805-814.


Martin, William; Russell, Michael J., On the origin of biochemistry at an alkaline hydrothermal vent, Philosophical Transactions of the Royal Society, 2007, pp. 1887-1925.


Martin, William; Russell, Michael J., On the origin of cells: a hypothesis for the evolutionary transition from abiotic geochemistry to chemoautotrophic prokaryotes, and from prokaryotes to nucleated cells, Philosophical Transactions of the Royal Society, 2002.


Martin, William; Russell, Michael J., The rocky roots of the acetyl-CoA pathway, Trends in biochemical sciences, Vol. 29, No. 7, July 2004, pp. 358-363.


Morowitz, Harold; Smith, Eric, Energy Flow and the Organization of Life, 7 August 2006.


Prigogine, Ilya, Time, Structure and Fluctuations, Nobel Lecture, 8 December 1977.


Russell, Michael J., First Life, American Scientist, Volume 94, January-February 2006, pp. 32-39.


Wachtershauser, Gunther, Origin of Life: RNA World Versus Autocatalytic Anabolism, Prokaryotes, 2006, pp. 275-283.  


About the Author

Jeff Atkins was born in Windsor, Ontario in 1948 and completed high school there. He attended the Royal Military College of Canada in Kingston, Ontario, graduating with an Honours BA in economics in 1971. He subsequently graduated from Carleton University in Ottawa with a Master's degree in history in 1978. He served as a Logistics Officer in the Canadian Armed Forces from 1971 until 1975, including time at sea aboard HMCS Protecteur. From 1977 until 1990 he held various positions at the Canadian Radio-television and Telecommunications Commission, including that of Director of Public Relations. He subsequently became the Manager of Operations at the Department of Transport Emergency Operations Centre in Ottawa.

He retired in 2003 after 35 years of military and public service. He was married for 30 years. He has travelled and lived all over the world and speaks fluent English and French. He is a licensed pilot and a certified scuba diver. He presently lives near Ottawa. He has had a lifelong interest in physics and history and believes that the former is the ultimate research tool for the latter.






This site was last updated 31/10/09