
Ken M. Anderson1*, Marvin Rubenstein1 and Minu Patel2
*Corresponding author: Ken M. Anderson Kanderso427@sbcglobal.net
1. Hektoen Research Institute, University of Illinois, Chicago, Illinois 60612, USA.
2. College of Nursing, University of Illinois, Chicago, Illinois 60612, USA and Bel-Air College of Nursing, Panchgani, Maharashtra, India.
This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/3.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
The consequences of interactions between deterministic and random events are increasingly referred to in the biomedical literature. Parallels have been drawn, for example, between their intersection and the evolution of life-forms, which to a major extent is believed to be influenced by the randomness of chance. Similarly, the origin of malignancies is currently considered to depend mainly on mutational events, broadly defined but especially those involving elements of the genome. The stressing of non-linear, dynamic, physical or chemical systems beyond their equilibria, which can lead to the multiple bifurcations underlying "chaos theory" with its strange attractors and fractals, contributes in surprising ways to macroscopic and possibly even submicroscopic events. With an interest in the properties these terms represent and their ability to influence events in biology, we cite some of the evidence for their application in evolution and malignancy and suggest a hierarchical classification of functional change.
Keywords: Interaction, stochasticism, determinism, random
Increasingly, the reader of the biomedical literature encounters the concepts of determinism, randomness, chance or stochasticism, deterministic chaos and their interactions, at times to the reader's dismay. Dynamic non-linear systems, bifurcations and period doubling, strange attractors, fractal geometry, the arrow of time, renormalization, path integrals, entropy and information theory may all be alluded to in this literature. The relationships between these entities and their application to the physical and biological sciences are often not self-evident, and a deeper understanding of them requires extended study, reliant upon an extensive knowledge of some complex mathematics and philosophy. The use of concepts from chaos theory in such apparently unrelated disciplines as weather forecasting, economic investment theory, population studies, evolutionary theory and even the initiation of heart beats in living organisms suggests their broad conceptual generality and potential importance for understanding more about the inorganic and organic worlds. The greater one's prior exposure to the underlying mathematics and philosophy, the greater one's understanding of what these terms can imply.
Despite these limitations, we summarize what many believe about how properties ascribed to the development and transfer of genomic information have been affected by the interplay between random and deterministic events, and even contributed to by deterministic chaos and fractal, self-similar systems, as these have impacted evolution and the more limited development of cancer. A view of functional change dependent upon a hierarchical and stochastic application of random mutagenic events in cancer cells, likely in some circumstances overseen by elements of deterministic chaos, is advanced.
Definitions of Deterministic and Random (Stochastic or "Chance") Events
It is commonly understood that, given sufficient information as to details, the occurrence of a deterministic event is predictable and reproducible within experimental error. An identified precursor, stimulus (or stimuli en train) results in an expected outcome. As defined in standard dictionaries, a "cause" produces an effect. However, causation is not a simple concept; it is at times confused with correlation, from which it is generally distinguished by randomized controls. Other forms of causation have been identified (domino, cyclic, spiraling, etc.) that potentially contribute to some deterministic outcomes [1]. Confusion between multiple events due to a common cause and those due to synchronicity, the appearance of events that have no identifiable causal relationship, needs to be kept in mind. The linear, dynamic, deterministic equations underlying classical Newtonian physics are considered to be reversible. Carried to an extreme, since the past and present are considered to determine the future, with sufficient (infinite?) information the future ought to be predetermined. For reasons to be cited, this view of a "clockwork universe" is considered an unrealistic over-simplification often in conflict with experience [2]. However cause and effect are conceived, they operate either locally or, if non-locally, within a field theory.
While random (stochastic) events are not subject to similar predictability, they can be subject to estimates of their probability of occurrence. The macro-state randomness that we all experience generally represents incomplete knowledge, inaccessible due to the inability to identify uncontrolled causes [3]. There seem to be several forms of randomness, macro-state and micro-state, depending upon context [4]. Originally, the study of stochastic events was the province of mathematics and the theory of probability. The concept of stochastic optimization arose from the successful solution of equations encountered in this field [5]. Interestingly, stochastic circuits have been developed that generate fairly deterministic outcomes [6]. This includes so-called "deterministic chaos" (see below).
Development of the quantum theory in the first half of the last century, with the apparent absence of determinism at atomic and sub-atomic micro-state levels, initiated a more nuanced understanding of the implications of determinism at a macroscopic level, for many, upending the determinism of classical physics [7].
In simplest form, an imaginary absolute specification of a deterministic outcome in an experiment would require a very large and ultimately unknowable number of significant figures in the experimental data [2,8,9]. To the extent that such precision is unattainable, due to factors including minor changes in experimental procedure, historical contingency and the like, this uncertainty can be viewed as representing an element of randomness even at the macro-state level. The inability to predict when a radioactive atom will decay is the quintessential example of micro-state quantum indeterminism. While individual random disintegrations are unpredictable and irreproducible, the probability distribution of disintegration times, and hence the half-life of atoms en masse, is measurable.
The prior "state" of a system, its' history, would seem to somehow influence or at least provide the physical basis for the unknown probability of an individual stochastic event. As an approximation, the overall probability of deterministic to random events might be distributed along a continuum of very probable to very improbable, and their periodicities distributed from more to very much less periodicity. Definitions of determinism and randomness usually tend to be interpreted as in common usage when macro-state events all encounter every day are considered. In general, a deterministic system will exhibit an error rate that either remains small due to rounding or truncation "noise", or if it increases exponentially with time, has evolved into a chaotic system. In their purest form, the variance and mean should be identical, characteristic of a Poisson distribution. Stochastic systems are said to exhibit a randomly distributed error rate [10].
To summarize, seemingly random events can result in deterministic outcomes while nominally deterministic events can be associated with random events. A form of "duality" exists in which the two modes of behavior co-exist and occasionally are randomly co-expressed.
Quantum probability
Many atomic, nuclear and sub-nuclear micro-state events occur randomly [11]. Once again, some sort of "law of large numbers" intervenes, and at the macro-state level randomness in the universe is as we perceive it [8,30]. Randomness associated with quantum events is thought by some not to be a question of insufficient information but rather an inherent proclivity for randomness. This seems to set quantum randomness apart from what we experience at macro-state levels [12]. The statistics employed depend upon context. Maxwell-Boltzmann statistics are applicable to classical macro-state physical problems. Studies of fermions (protons, electrons, neutrons and related "particles") employ Fermi-Dirac statistics. Bosons (photons and the other entities that carry forces such as electromagnetism and the strong and weak nuclear forces) can be analyzed with Bose-Einstein statistics. No two fermions can occupy the same quantum state (the Pauli exclusion principle), a restriction that does not apply to bosons. The relationship between the probability associated with deterministic classical macro-state mechanics and that associated with quantum mechanics is subject to differing interpretations [13]. One suggestion is that quantum theory applied to the macro-state environment represents a limiting case of the theory [14,15]. It has also been suggested that quantum mechanics is deterministic about probabilities. The wavelengths associated with particles are considered to be inversely related to their momenta (mass times velocity); therefore, by this argument of de Broglie, quantum effects on macro-state objects are imperceptible [16]. These unsettled differences persist in considerations of randomness, stochasticism and chance in macro- and micro-states [17,18].
In attempting to resolve the interplay between deterministic and stochastic events at macro- and micro-state dimensions, the Heisenberg uncertainty principle is relevant [15]. It can be stated as the inability to simultaneously and precisely specify the position and momentum of an atomic or sub-atomic particle, thereby rendering deterministic calculations of its future essentially impossible [2,15]. Planck's constant (6.626 x 10^-34 J s, i.e., m^2 kg/s) sets the scale of the trade-off between the two parameters. At macroscopic dimensions its effect is normally not noticeable. Is the appearance of inherent randomness and the absence of causality in quantum mechanics independent of circumstances and context? We leave it to the reader to choose.
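In symbols, the uncertainty relation and the de Broglie wavelength referred to above are conventionally written as follows; the numerical case for a macroscopic object is ours, added for illustration:

```latex
% Heisenberg uncertainty relation (position-momentum form):
\Delta x \,\Delta p \;\ge\; \frac{\hbar}{2}, \qquad \hbar = \frac{h}{2\pi}
% de Broglie wavelength of a particle with momentum p = mv:
\lambda = \frac{h}{p} = \frac{h}{mv}
% Illustrative macro-state case, m = 1 kg moving at v = 1 m/s:
\lambda = \frac{6.626\times10^{-34}\,\mathrm{J\,s}}{(1\,\mathrm{kg})(1\,\mathrm{m/s})}
        \approx 6.6\times10^{-34}\,\mathrm{m}
```

The resulting wavelength is many orders of magnitude below even nuclear dimensions, which is why quantum effects are imperceptible for macroscopic objects.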
Chaos Theory (in Brief)
Chaotic events differ from purely random ones in at least the following ways: they are irregular yet ordered, associated with a fractal (self-similar) structure, especially if due to "strange attractors", very sensitive to initial conditions and ultimately deterministic [10,19,20]. Chaotic systems thus differ from truly random systems in being deterministic. Complex systems differ from chaotic ones in that they involve dynamic interactions between large numbers of subunits exhibiting self-organization, lack of equilibrium, openness to the environment and feedback [21]. Chaotic systems exhibit "topological mixing", as seen in the mixing of two colored dyes. Chaotic systems can be complex, and complex systems can exhibit chaotic behavior.
The basic chaotic logistic difference equation includes a "driving parameter" r. Starting with a low fixed value of that parameter, the equation is "run" recursively [22]. As r is increased, the output begins to oscillate between two values, termed a bifurcation. Bifurcations represent points at which behavioral changes can occur. Depending upon the value of r, cycles of period 2, 4, 8, etc. appear, and at values of r greater than about 3.57 the output becomes chaotic. There are values of r for which a sequence of period 3 appears, alternating between regular and chaotic cycles. Strange attractors, chaotic fractal systems strongly dependent upon initial conditions, are distinct from equilibria and steady states with fixed-point attractors, and from periodic states with limit-cycle attractors sustaining periodic behavior in a system. States of matter beyond the first bifurcation and maintained far from equilibrium are termed dissipative states. Fractal geometry represents the self-similarity of irregular patterns at all recursive iterations; such patterns tend to be scalable [23].
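The equation itself is x_{n+1} = r x_n (1 - x_n). A minimal Python sketch (ours; the starting value and r values are chosen to match the behaviors described above) iterates it and prints the long-run orbit at several values of r:

```python
def logistic_orbit(r, x0=0.2, transient=500, keep=8):
    """Iterate x_{n+1} = r * x_n * (1 - x_n); return `keep` values after
    discarding a transient, to show the long-run behavior."""
    x = x0
    for _ in range(transient):
        x = r * x * (1.0 - x)
    orbit = []
    for _ in range(keep):
        x = r * x * (1.0 - x)
        orbit.append(round(x, 4))
    return orbit

# r = 2.8: a single fixed point; r = 3.2: a period-2 cycle;
# r = 3.5: period 4; r = 3.9 (beyond ~3.57): aperiodic, chaotic output.
for r in (2.8, 3.2, 3.5, 3.9):
    print(r, logistic_orbit(r))
```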
Chaotic systems are dynamic systems exhibiting sensitive responsiveness to initial conditions. While ultimately deterministic, their sensitivity to initial conditions renders long-term prediction so unreliable as to be impossible. This is thought by some not to be due to random behavior but to the nature of the basic logistic equation, which mandates responses, in unpredictable ways, to minor changes in the apparent initial conditions [22,23]. Others believe that an element of randomness enters the longer the system is under study and can eventually increase exponentially. Applications in physical and biological systems involving large numbers of interacting partners are numerous, including meteorology, finance, physics and biology. These systems can be summarized as multi-component, deterministic systems whose outcomes become unpredictable with time due to unusual sensitivity to minor differences in initial conditions. The application of Bayesian statistics to some of these problems, in which an outcome can be altered by the introduction over time of estimates of changes (observable, latent, unknown or theoretical), seems to mimic chaos theory [24]. Finally, deterministic systems tend to have error rates (from comparison of time series and test states) that are either small and stable or, if chaotic, increase exponentially with time; a stochastic system has randomly distributed errors [10]. Deterministic chaos is to be distinguished from stochastic chaos, the latter due to uncontrollable random fluctuations (noise) in the external environment [16].
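Sensitive dependence on initial conditions can be made concrete with the same map (again a sketch of ours, not a formal Lyapunov-exponent calculation): two trajectories started 10^-10 apart diverge to order-one separation within a few dozen iterations.

```python
r = 3.9                          # well inside the chaotic regime
x, y = 0.2, 0.2 + 1e-10          # initial conditions differing in the 10th decimal

for n in range(1, 61):
    x = r * x * (1.0 - x)
    y = r * y * (1.0 - y)
    if n % 10 == 0:
        print(f"n = {n:2d}   |x - y| = {abs(x - y):.3e}")
# The separation grows roughly exponentially until it saturates near 1:
# the rule is deterministic, yet long-term prediction is hopeless.
```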
To summarize: many currently believe that the randomness of individual quantum mechanical events is neither deterministic nor predictable, while randomness in chaos theory and in statistical mechanics is deterministic but not predictable [17,18,25,26]. If "hidden variables" were ever identified, determinism could be restored to quantum mechanics.
Statistical Mechanics, Probability, Entropy and the Arrow of Time (Very Brief Overview)
Statistical mechanics is a system for characterizing the average properties of very large collections (ensembles) of similar components with the use of probability theory [27]. Macroscopic (macro-state) systems can be characterized by measurable properties such as temperature, pressure and the like. Characterizing micro-state systems containing large numbers of identical components would require calculating the position and velocity of each particle over time, together with their interactions, an unattainable task. If each micro-state is considered equally probable, statistical reasoning can be applied to calculate the probability that a particular macro-state is occurring. Non-equilibrium statistical mechanics involves ensembles that evolve over time. The probability of occurrence of a macro-state configuration is reflected in the ratio between its multiplicity and the total number of equivalent micro-states in the system. Entropy, the degree of disorder of a system, depends upon the multiplicity of its molecular or atomic configurations. A system's probability is related to the number of its molecular configurations, their disorder and the resulting entropy [27,28]. The larger the number of interacting components, the greater their entropy, or degree of disorder, and their probability. It is this connection between probability, thermodynamics and entropy that some consider as giving a direction to time, the "arrow of time" [16,24]. The increase in entropic events mandates a direction to time. This formulation has been disputed but provides a frame of reference for discussing such problems. A very complete, largely non-mathematical discussion of this topic is available in reference [16].
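In symbols, the relationships sketched above are conventionally written in Boltzmann's formulation:

```latex
% Boltzmann entropy in terms of the multiplicity W (number of micro-states):
S = k_B \ln W
% Probability of macro-state i as the ratio of its multiplicity to the total:
P_i = \frac{W_i}{\sum_j W_j}
```

Higher multiplicity thus means both greater entropy and greater probability, which is the link exploited in the "arrow of time" argument.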
In this context the concept of the path integral, or sum over histories, developed by Feynman is relevant [29]. The experimental setup includes a series of electrons passing through a screen punctured by several small apertures, the electrons imagined as potentially taking all possible pathways to reach a recording film. The most direct pathways receive the maximum number of "hits"; longer trajectories receive fewer. Pathways are subject to various forms of interference, represented as vector sums of their wavelengths, some constructive (in phase), others destructive (out of phase) to varying degrees. Although in principle all pathways are represented, only some yield a definite pattern and are, in a sense, positively deterministic in outcome. Translated, this becomes a statistical distribution of electron hits on a film appearing as a series of hills (constructive) and valleys (destructive), similar to the interaction of waves in and out of phase. Superficially this result resembles the central limit theorem of statistics, in which the unknown distribution of a very large population of particles or events can yield a definitive probability distribution [30]: identically or non-identically distributed independent variables converge to a series of means, generating Gaussian (bell) curves.
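A minimal sketch of the central limit theorem as invoked here (ours; assuming Python with numpy): means of repeated samples drawn from a markedly non-Gaussian distribution nonetheless form a bell curve around the true mean.

```python
import numpy as np

rng = np.random.default_rng(1)

# 10,000 samples of 50 draws each from a strongly skewed, non-Gaussian
# (exponential) distribution with true mean 1.0.
samples = rng.exponential(scale=1.0, size=(10_000, 50))

# By the central limit theorem, the per-sample means are approximately
# Gaussian around 1.0, with standard deviation ~ 1/sqrt(50).
means = samples.mean(axis=1)
print(f"mean of sample means = {means.mean():.3f}  (theory: 1.000)")
print(f"std of sample means  = {means.std():.3f}  (theory: {1 / 50 ** 0.5:.3f})")
```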
An important aside: life forms represent highly ordered and apparently improbable collections of organic molecules. Happily, this does not violate the second law of thermodynamics, that natural processes tend to progress to greater disorder [27,28]. The "open" systems of life forms continuously excrete or degrade relatively uncomplicated, low-molecular-weight compounds of lesser "order" and greater entropy than their precursors. Left behind are far more complex, functioning structures exhibiting reduced randomness and lesser entropy. At least for a time.
Types of Noise Affecting Biological Systems
Cellular noise is represented by random variability in the biochemical events underlying the processes of organic evolution, cellular replication and metabolism. Random thermal motion is related to temperature, and inter-molecular encounters depend in part on Brownian motion. Cellular noise has been classified as extrinsic or intrinsic [31]. An assay used to distinguish them employs two genes, one coding for a cyan fluorescent protein, the other for a yellow fluorescent protein, both driven by the same promoter. In single cells, extrinsic noise was identified when both genes, and subsequently their protein products, fluctuated in concert; intrinsic noise when only one of them did. Extrinsic noise has been related to variation in the numbers of polymerases, mRNAs or ribosomes affected and to variability in rate constants as they interact. Extrinsic noise has also been defined as different responses of two identically regulated genes among different cells, dependent upon factors such as age, cell cycle behavior, physical environment, etc.
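A common quantitative treatment of this dual-reporter assay takes intrinsic noise from the uncorrelated disagreement between the two reporters and extrinsic noise from their covariance. The sketch below is ours: the synthetic data and parameter values are purely illustrative, and the decomposition follows the standard two-reporter formulas rather than anything specified in [31].

```python
import numpy as np

rng = np.random.default_rng(2)

n_cells = 5_000
# Synthetic per-cell fluorescence for two identically regulated reporters:
# a shared lognormal factor models extrinsic noise (cell state affecting both
# genes), while each reporter also carries its own independent intrinsic noise.
shared = rng.lognormal(mean=0.0, sigma=0.3, size=n_cells)
cfp = 100.0 * shared * rng.lognormal(0.0, 0.2, n_cells)   # cyan reporter
yfp = 100.0 * shared * rng.lognormal(0.0, 0.2, n_cells)   # yellow reporter

mean_c, mean_y = cfp.mean(), yfp.mean()
# Intrinsic noise: the uncorrelated disagreement between the two reporters.
eta_int_sq = np.mean((cfp - yfp) ** 2) / (2.0 * mean_c * mean_y)
# Extrinsic noise: the correlated (co-varying) part.
eta_ext_sq = (np.mean(cfp * yfp) - mean_c * mean_y) / (mean_c * mean_y)
print(f"intrinsic noise^2 ~ {eta_int_sq:.3f}, extrinsic noise^2 ~ {eta_ext_sq:.3f}")
```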
Intrinsic noise is considered to be related to random events affecting transcription or translation. If a large number of components is involved in the molecular events potentially affected, the "law of large numbers" may obscure some of the noise [32]. When the number of components potentially affected is small, a unique event due to noise is more likely to become apparent. Since most biological processes are not at equilibrium, anomalous results dependent upon non-equilibrium biological noise seem more likely than those associated with processes at equilibrium.
How many different types of noise need to be considered? Rounding errors are distinguished from other vagaries of data inputting. Are genomic or epigenetic mutations or related aberrations to be considered "noise", usually of an intrinsic form? Once a mutation has occurred, its effects should not be directly dependent upon current stochastic influences due to thermal or molecular interactions related to Brownian motion. Its expression should be deterministic, or perhaps "quasi-deterministic" if not fully so, especially if or when paired with one or more "interaction" partners. Noise, especially the intrinsic form, whether arising from mutations or not, could be useful, deleterious or, probably much more often, irrelevant for cells, depending on circumstances.
Cellular noise has been implicated in a number of biological effects. To list a few: gene expression levels, energy levels and phenotypic selection, stem cell development and differentiation, features of cancer treatment and cellular information processing [33].
Have both intrinsic and extrinsic noise contributed to evolution and cancer? Are most forms of noise non-stochastic but others stochastic; are most aperiodic, others less so? Could noise reduction reduce the incidence of "desirable, deleterious or irrelevant" mutations, perhaps even affect cellular aging or cancer? The role of noise in biology has been extensively studied with many unsettled questions [33-35, to list a few].
Stochasticity and the Law of Large Numbers Applied to Biological Events
At first view it might seem odd that random genomic events can lead to populations of cells exhibiting deterministic outcomes affecting their survival (see below). If affected cells are initially few in number, stochastic effects can be disproportionately influential. Although the final composition of a small population of replicating cells is uncertain due to genetic drift (a change in the frequency of a gene variant in a population due to random sampling), were the proliferation of one clone sufficiently robust to outgrow its unaffected companions, it could eventually become the dominant clonal representative.
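A minimal Wright-Fisher-style sketch of this point (our construction; the population size, fitness advantage and run length are arbitrary): a variant clone starting as a single cell is usually lost to drift, but with a sufficient proliferative advantage it sometimes sweeps to dominance.

```python
import numpy as np

rng = np.random.default_rng(3)

def clone_fate(pop_size=200, advantage=0.05, generations=500):
    """Wright-Fisher resampling of a clone that starts as one cell with
    relative fitness (1 + advantage); returns its final frequency."""
    freq = 1.0 / pop_size
    for _ in range(generations):
        weighted = freq * (1.0 + advantage)             # fitness weighting
        p = weighted / (weighted + (1.0 - freq))        # expected next share
        freq = rng.binomial(pop_size, p) / pop_size     # random sampling = drift
        if freq == 0.0 or freq == 1.0:                  # lost or swept
            break
    return freq

fates = [clone_fate() for _ in range(1_000)]
swept = sum(f == 1.0 for f in fates)
print(f"clone swept to dominance in {swept}/1000 runs; it was lost in most others")
```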
The "law of large numbers", essentially the addition of small Gaussian uncertainties [8,30], a statement of the central limit theorem of statistics, is often invoked to explain a reduced effect of aberrant cells on a population's subsequent composition. Individual variations, some related to noise, are summed in bulk, in a kind of mass-action averaging, the outcome however appearing as deterministic due to the large number of cells expressing behaviors close to some broad average. To employ an earlier analogy [36], individual molecules of water in a wave are distinctive in many ways, positions, momentum, number of interactions with other water molecules, etc., perhaps even somewhat chaotic, yet the wave eventually reaches the shore, a deterministic outcome. It does so due to an overall structure related to the relationships between the much larger numbers of more average molecules of water, a structure sufficient to dilute out effects of the lesser number of outliers, unless their effect or numbers were sufficiently disruptive to disturb the forces maintaining the behaviors of the much more numerous molecules tending toward the "average". This amounts to a representation of the central limit theorem of statistics. The general argument cited here seems a variant of the argument underlying statistical mechanics.
Mutational Events Provide the Primary Basis for Evolutionary Change and Its Junior Companion, Malignant Change
We will not dwell on the considerable evidence supporting the idea that genomic mutation is the fundamental driving force of organic evolution [37-39]. In addition, evidence has accumulated that cell differentiation is a stochastic process guided by natural selection [40]. These ideas, coupled with the view that cancer development in some instances represents blocked differentiation or dys-differentiation [41], are all "in play". Complementing these foundational principles is the evidence in malignant change for both stem cell and somatic cell mutations, whether random in occurrence or induced by known carcinogens [42,43]. Such changes may occur in the stochastic somatic cell model or in the hierarchical model involving stem cell oncogenesis.
In the case of epithelial cancers, this often depends upon a sequence of genetic and/or epigenetic events occurring over a number of years [44]. As one of many examples, primary melanomas have been shown to conserve genomic changes stepwise over time [45]. In a study of 19 patients with metastatic melanoma, genotypic and phenotypic profiling of malignant, immune, stromal and endothelial cells was performed [46]. Transcriptional heterogeneity related to the cell cycle, spatial context and drug resistance of cells in the same tumor was present. Widespread intra- and intercellular spatial, functional and genomic heterogeneity, with active and dormant drug resistance, was observed. For many cancers there is a relationship between the number of stem cell divisions and the chance of developing that cancer [47], and a relationship between the number of somatic mutations in cells and the number of their cell divisions [48].
Examples of Chaos Theory Contributing to Biological Processes
Chaos theory has been applied to a number of biological processes. The general form and structure of an ingrowing vascular network [49] and the multiple bifurcations of the pulmonary system have been modelled with this theory [50]. In the pulmonary system, this implies a relationship with growth factors such as TGFB and FGFR, integrated in a program mandating a bifurcating, repetitively patterned architecture. The rhythm underlying repetitive heart beats is thought to be influenced by a chaotic component [51]. Ion transport systems and the development of the normal and malignant hematopoietic and immune systems are subject to stochastic modulation [52-55]. The ability to introduce an element of randomness, possibly more "radical" if due to Level 3 (see below) or chaotic sequelae, into the response to variable physiologic requirements can provide additional flexibility for cells undergoing environmental or internal stresses. Stochastic changes in rate processes could offer alternatives to strictly deterministic responses. Stochastic determinism has been implicated in the development of hematopoiesis [56] and is implicit in both the hierarchical model involving somatic cells and the stem cell model of cancer development and progression [41,56].
Given the scalability inherent in chaos theory and the fractal character and self-similarity of "strange" attractors [57], do these properties provide a complementary "platform" on which to structure stochastic events of a different character from those of Level 1, 2 and 3 events, which together can promote cellular to organismic proliferation and survival? This could provide the capacity to respond at both levels to environmental or other stresses, while retaining most of the accumulated inherited history from their earliest evolutionary time.
A Hierarchical Classification of Random Genetic Change, Supplemented with Chaos Theory
We suggested [36] that random genomic changes could be classified according to a hierarchical scheme: Housekeeping (Level 1), Tactical (Level 2) and Strategic (Level 3). Level 1 "Housekeeping" changes might include random events that affect basic functions of differentiated cells. Physiologic or random changes in the activity of a drug efflux mechanism, DNA repair or metabolic/energy provision could serve as representative examples. Interaction partners could include enzymes, signal transduction or other regulatory molecules, or gene modules. Level 2 "Tactical" changes would represent more complex random changes affecting signal transduction, DNA synthesis and replication, and diverse cellular responses to increasingly severe stress. Mutations of oncogenes and of suppressor, progressor or driver genes, epigenetic modifications, and components of cellular control and differentiation represent potential targets. Cell regulatory nodes and synthetic and regulatory modules are higher-order targets potentially susceptible to stochastic modification. Lastly, Level 3 "Strategic" changes would include extensive or even fundamental changes in gene expression, some likely of deep ancestral origin, whose disruption or other modification induces profound changes in multiple developmental "circuitry". Examples might include duplication of whole or large portions of chromosomes, or the introduction, deletion or modification of regulatory circuitry, producing fundamental changes in ongoing cell function. Presumably, Level 1 and 2 events would be far more frequent than those of Level 3. The categories could to some extent blend, depending upon the combinatorial possibilities presented initially or over time, as influenced by a form of "stochastic optimization"; i.e., the nascent clone either survives to procreate or dies. Initially, many routine housekeeping events probably would involve fairly deterministic responses of rate constants, adjustments to substrates, and the like, prior to any stochastic modification of their components or interactions.
It is unclear how best to integrate chaos theory with this scheme. The theory posits a deterministic outcome, possibly based on sequential stochastic events, that is considered scalable at all iterations. Major biological examples include the fractal nature of both the vascular system [49] and the pulmonary system, from bronchi to bronchioles [50]. One result of chaos theory seems to be the modification of the development of systems of cells or other entities (financial systems, weather, etc.) in extended and essentially repetitive patterns over increasing time, punctuated by random, entirely unpredictable events. Some of these characteristics seem to scale to cellular and sub-cellular pattern formation; for example, are there repetitive chromatin substructures that undergo chaotic deterministic patterning? The generally inherent stability of cell lineages and of much of their essential control and implementing molecular circuitry, dependent upon the vagaries of genomic events that developed over eons, supports a tendency toward considerable stability, with retention of circuitry that "works", i.e., promotes continued cellular to organismal reproduction and survival. A lock-and-key analogy suggests itself. If the stochastic event is the "key", then without a "lock" representing an "interaction partner" it is difficult to see how a deterministic change would occur. Conceivably, the stochastic event might eventually initiate a series of sequelae leading to a newly developed "lock" capable of interacting with the "key", resulting in persistent developmental change. This should involve some delay while these changes occurred.
It is not certain how a stochastic genetic or epigenetic change leading to a loss of some crucial function might play out. If the function had served an important regulatory activity, as with the loss of P53 in, for example, a malignant stem cell, that loss could promote the continued survival of a malignant clone. The rare reports of cancers apparently undergoing "spontaneous" cures might reflect a back-mutation or other loss of function of an oncogene or of some related regulatory component of a malignant stem cell.
Properties inherent in chaotic systems may contribute to the underlying stability of cellular and subcellular function. Otherwise, some of the genetic heritage of the various organisms could be lost. Although subject to episodic random events potentially contributing to evolutionary development, overall a predominantly deterministic outcome reflecting significant contributions from the accumulated genomic history needs to be maintained in affected life forms if they are to persist.
The fractal nature of deterministic chaos's inherent logic seems to incorporate the statistical behavior of ensembles that include large numbers of interacting entities in repetitive and often scalable patterns. Details as to how this might integrate with other instrumental forms of cellular control do not seem to have been established but would seem to depend upon the nature of their "interaction partners". Curiously, its "logic" applies to both the inanimate and animate worlds.
Some additional support for this general idea has been provided by a proof involving the "Ramsey theorem for pairs" [58]. Verbally, it is interpreted as forming an unexpected connection between concepts of the finite and infinite. It implies that some degree of order can be nested within what appear to be chaotic systems; in other words, within any large, complex system there can be subsystems that retain more structure. This seems relevant for an understanding of deterministic chaos. There is a sense that this could even be related to Gaussian (bell-shaped) statistics and implicate aspects of the central limit theorem of statistics. Is the "true" mean of a large chaotic sample, sampled multiple times, related to a "deterministic" core?
Recently, an additional connection between classical and quantum physics has been suggested, implicating chaos theory and quantum entanglement [59]. Superposition (that a particle can be located at several places simultaneously) and entanglement (that particles can be physically "linked" although located at considerable distances from each other), combined with classical chaos theory, were employed to suggest a fundamental relationship between quantum theory and classical physics. The experimental system included three quantum bits ("qubits"), somewhat analogous to classical binary bits (0 and 1) but exhibiting an additional superposition of the two states and the ability to undergo entanglement, yielding correlated measurements of multiple particles resembling classical macro-state physics. With electronic pulses, the resulting "entanglement entropy" of a qubit led to regions of entanglement in the quantum system resembling regions of chaos in the classical system. Entanglement and chaos correlated closely when an equilibrium (thermalization) was reached. Presumably this has something to do with the result of, for example, kicking a rock: you stub your toe.
The transition from quantum micro-state to classical macro-state events seems to depend upon the number of quantum particles as they interact with electromagnetic waves [60]. Large numbers of electrons at the temperature of liquid helium, transferred to a cavity containing electromagnetic waves, exhibited "strong couplings" in which changes in electromagnetic wave frequency and electron behavior were consistent with the classical rather than the quantum world. This is seemingly another restatement of the "law of large numbers" [8,30]. Introduction of non-linearity with a "qubit" could revert the system to a quantum state [60].
Certain species are believed to have evolved relatively little over evolutionary time; others are thought to be comparatively free from malignant change. Resistance to genomic change might in part be provided by unusually active DNA repair processes. Resistance to drastic evolutionary change could be contributed to by a predominantly deterministic regulatory substratum, with more or less random access to chaotic events, that in some sense provides a stabilizing platform of, for example, chromatin structure, yet includes flexibility due to stochastic (Level 1, 2, 3) and chaotic events. Since the mechanistic details of random events are unknown, until and unless such details can be identified they cannot be intentionally modified. Fractal dimensions of a protein's interior [61] and of chromatin [62] have been described. It is of interest that the fractal complexity of DNA coding sequences was found to be less than that of non-coding ones [54].
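For readers unfamiliar with how such fractal dimensions are estimated, the usual approach is box counting: cover the object with boxes of shrinking side eps and take the slope of log N(eps) versus log(1/eps). A self-contained sketch (ours) on a synthetic point set with a known dimension, the Sierpinski triangle (log 3 / log 2, about 1.585):

```python
import numpy as np

rng = np.random.default_rng(4)

# Build a Sierpinski triangle by the "chaos game"; its box-counting
# dimension is known exactly: log 3 / log 2 ~ 1.585.
vertices = np.array([[0.0, 0.0], [1.0, 0.0], [0.5, np.sqrt(3) / 2]])
point = np.array([0.1, 0.1])
points = np.empty((100_000, 2))
for i in range(len(points)):
    point = (point + vertices[rng.integers(3)]) / 2.0   # jump halfway to a vertex
    points[i] = point

# Box counting: how many boxes of side eps contain at least one point?
sizes = [2.0 ** -k for k in range(2, 8)]
counts = [len({(int(x / eps), int(y / eps)) for x, y in points}) for eps in sizes]

# The box-counting dimension is the slope of log N(eps) vs log(1/eps).
dim = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)[0]
print(f"estimated dimension ~ {dim:.2f}  (theory: {np.log(3) / np.log(2):.3f})")
```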
There is an increasing understanding that randomness, stochasticism or chance is at the core of physical and biological existence. Without such interventions, any organic evolution would seem to have been an extremely slow and ponderous undertaking. Opportunities for limited to radical (punctuated equilibria) reshaping of the biological world have been infrequent and possibly often ineffectual, due to limited combinatorial (Level 1, 2, 3) options. Chance provides a means of randomly exploring much more robust solutions to exigent threats to survival. In addition to the available motifs of change, Levels 1, 2 and 3 if you will, it appears that further levels of logic, layered over these perhaps more restricted modalities of change, would be provided by the properties of chaos theory. The latter seems to impose and modulate developmental events that, in biological organisms, organize and direct ensembles of cells functioning in collections directed toward the formation of the complex structural elements required for the integrated functioning of tissues and organs. Curiously, similar rules are applicable to collections associated with inorganic systems, including weather forecasting, the stock market, populations of living organisms and physical or chemical phenomena included in a variety of inanimate events [19]. Once again, the effects of chance, even extending to the randomness of chaotic behavior, are delimited by overall deterministic outcomes. Are there other, deeper, higher-order forms of logical outcome that mandate still more complex inanimate/animate developmental relationships? Past developmental events based on randomness will have imposed a certain logic on the nature of the possible events to follow. If, as is suggested, the scalability of chaos theory applies to both intercellular and intracellular stochastic events, some of the latter may be regulated by the logic contained within that theory. At all levels the role of chance intrudes, yet its effects must be delimited by some form of deterministic outcome if they are either to be observed or to have any lasting effect.
In some sense it would seem that the capacity of purely physical systems to "evolve", adopting uniquely new forms of manifestation not previously available, would be much more limited than the great variety of combinatorial options available to organic life-forms undergoing stochastic events, past and future. And yet, life-forms arose from inorganic chemical interactions. Man-made radioactive elements of atomic number greater than 92 might yield new physicochemical combinatorial options. Macro-state physical laws otherwise would at first view seem immutable, although some even challenge that view. Efforts to develop modified life forms based on altered genomic codes or nucleic acids suggest a potential for future organic developmental uniqueness [63,64]. Organic evolution, as distinct from any potential evolution of physical laws, would seem to be radically different. Given the nature of the physical universe, as far as it is presently understood, presumably much of this could not have been otherwise.
The authors declare that they have no competing interests.
We thank the Seidel Family Trust and Dr. Jules Harris for their support. Thanks are due to Dr. George Dunea, Director of the Hektoen Institute and to Dr. Peter Hart, Director of the Renal Division and his staff at the Cook County Hospital, Chicago, Illinois for their continued support. We thank Kevin Grandfield of the University of Illinois College of Nursing for his editorial assistance.
Editor: Paul J. Higgins, Albany Medical College, USA.
Received: 12 September 2016 Revised: 28 October 2016
Accepted: 22 November 2016 Published: 30 November 2016
Anderson KM, Rubenstein M and Patel M. The intersection of chance with determinism: definitions and an application. J Cancer Ther Res. 2016; 5:8. http://dx.doi.org/10.7243/2049-7962-5-8
Copyright © 2015 Herbert Publications Limited. All rights reserved.