
Article List

29 November 2015
08:09 AM

Living Algorithm Articles (Key below)


1. Data Stream Mathematics: Requirements (Headings 3/4, Stage K, 9 pages)
2. Living Algorithm's Predictive Cloud (Headings 3/4, Stage K, 7 pages)
2a. Living Algorithm's Evolutionary Potentials (Headings 3/4, Stage L, 13 pages)
3. The Batting Average: Living Algorithm vs. Probability (Headings 3/4, Stage K, 8 pages)
4. Mathematics of the Moment (vs. Probability) (Headings 3/4, Stage L, 6 pages)
5. General Patterns vs. Individual Measures (Headings 3/4, Stage L, 4 pages)
6. Dynamic Causation vs. Static Description (Headings 3/4, Stage L, 11 pages)
7. Mathematics of Relationship (Headings 3/4, Stage L, 3 pages)
8. Precision vs. Fungible Meaning (Headings 3/4, Stage L, 5 pages)
9. Living Algorithm Algorithm (Headings 3/4, Stage L, 9 pages)
10. Mathematics of Informed Choice (vs. Deterministic Physics) (Headings 3/4, Stage L, 11 pages)
Expository Totals: 11 articles, 86 pages
Bio Totals: 3 articles, 7 pages
Narrative Totals: 9 articles, 27 pages
Grand Total: 120 pages

Living Algorithm Narratives (Key below)

1-2: Life searching for a Mathematics of the Moment (2 pages)
3-4: Is the Living Algorithm just an insignificant subset of Probability? (2 pages)
4-5: Probability challenges Living Algorithm's scientific credentials. (3 pages)
5-6: Probability's Numbers vs. Living Algorithm Patterns (3 pages)
6-7: From Causation to Relationship (5 pages)
7-8: Can the Living Algorithm provide Life with Fungible Meaning? (2 pages)
8-9: Could Life employ Living Algorithm to digest Data Streams? (2 pages)
9-10: Comfortable with Living Algorithm's Algorithm, Life wonders about Choice. (3 pages)
10-Dyn1: Digestible Information & the Living Algorithm's birth (5 pages)
Totals: 27 pages


Living Algorithm Bios (Key below)


1. Data Stream's special Significance: Evolution of Understanding (Headings 4, Stage K, 3 pages)
2. Living Algorithm's special Significance: Evolution of Understanding (Headings 4, Stage K, 3 pages)
9. Just following Directions (1 page)
Totals: 7 pages

Living Algorithm Articles All (Key below)

1. Data Stream Mathematics: Requirements
1B. Data Stream's special Significance: Evolution of Understanding
1-2: Life searching for a Mathematics of the Moment
2. Living Algorithm System's Predictive Clouds
2B. Living Algorithm's special Significance: Evolution of Understanding
3. The Batting Average: Living Algorithm vs. Probability
3-4: Is the Living Algorithm just an insignificant subset of Probability?
4. Mathematics of the Moment (vs. Probability)
4-5: Life yearns for a Mathematics of Relationship
5. Mathematics of Relationship
5-6: Can the Living Algorithm provide Life with Fungible Meaning?
6. Precision vs. Fungible Meaning
6-7: Could Life employ Living Algorithm to digest Data Streams?
7. Living Algorithm Algorithm
7-8: Comfortable with Living Algorithm's Algorithm, Life wonders about Choice
8. Mathematics of Informed Choice (vs. Deterministic Physics)
9. Dynamic Causation vs. Static Description
9B. Just following Directions
Totals: 99 pages

Key
This is the article list for The Living Algorithm System, the second monograph in the series associated with the study of Behavioral Dynamics. On this page the Reader will find the linked articles along with several annotations. The Headings annotation links to two levels of outline detail: if the Reader is interested in Section headings, click on 3, and if interested in Paragraph headings, click on 4. The Stage annotation indicates whether an article has been edited (L/K) or not (-). The Pages annotation indicates the length of each article. The lists also distinguish finished articles from those that are still a work in progress (WIP).


Life's requirements for an Information Digestion System


29 November 2015
08:31 AM

Have you ever considered how we translate the impersonal digital information of 1s and 0s into personal knowledge that is relevant to our existence? For instance, how are we able to derive 'music' from our favorite CD? How is it that we dance wildly or cry uncontrollably when we hear a sequence of 1s and 0s that can't even touch each other? What is the translation process that bridges the infinite chasm between these two simple numbers?

I am excited to present a plausible theory that accounts for our personal connection to the impersonal digital sequences contained on our CDs, DVDs, computers and iPhones. The process that seems to enable connectivity is contained in a mathematical system
labeled Information Dynamics. The theory concerns how living systems digest digital information to transform it into a form that is
meaningful to Life. While the information processing epitomized by a computer is of necessity static, exact and fixed, living information
digestion is of necessity dynamic, approximate and transformational. Think of the difference between a baby and a computer.

Our initial monograph illustrated some of the many patterns of correspondence between the mathematical processes of Information
Dynamics and empirical reality. These include the harmful effects of Interruptions to the Creative Process, the negative impact of Sleep
Deprivation, the Necessity of Sleep, and even the Biology of Sleep.
These striking correspondences evoke some distinct questions. Why does the mathematical model behave in similar fashion to
experimentally verified behavioral and biological reality? Could these correspondences be a mere coincidence, some kind of odd artifact? Or perhaps the striking patterns are due to some as yet undiscovered molecular/subatomic mechanism? Or could these odd correlations between mathematical and living processes be due to the process by which living systems digest information?
We chose to explore the last theory. The first question we posed ourselves: What kind of information digestion process would a living
system require? What are the entry level requirements?
The following essay addresses three questions. Why do living data streams best characterize the dynamic nature of living systems? Why do data streams require a new mathematics? And what requirements must this data stream mathematics fulfill if it is also to be the mathematics of dynamic living systems?

Living Systems require New Mathematics of Data Streams


The highly respected Dr. Lotfi Zadeh is considered to be the father of the well-established fuzzy logic approach to engineering. Dr. Zadeh co-authored the first book on linear systems theory in 1963, which immediately became a standard text for every engineering school. His prestige in the scientific community was sealed with this publication. Yet, he was already moving in a contrary direction. He was grappling with the difference between living systems and material systems. Most of his colleagues, in their attempt to accurately characterize the features of living systems, were pursuing a course that attempted to apply the mathematics of inanimate systems to what were becoming known as animate systems. The effort to capture the unique features of animate systems with conventional mathematics resulted in ever-greater levels of complexity and precision. He sensed that this effort by his colleagues to distinguish between 'animate' and 'inanimate' systems was conceptually misguided. He had begun to realize that living systems are qualitatively different from non-living systems, and this difference cannot be captured by the mathematics of inanimate sets, no matter how complex. He recognized that a new mathematics was required to articulate this qualitative difference. He published a paper in 1962 entitled "From Circuit Theory to System Theory" that foreshadowed his new perspective.



There are some who feel this gap [between 'animate' and 'inanimate' systems] reflects the fundamental inadequacy of the conventional mathematics - the mathematics of precisely-defined points, functions, sets, probability measures, etc. - for coping with the analysis of biological systems, and that to deal effectively with such systems, which are generally orders of magnitude more complex than man-made systems, we need a radically different kind of mathematics, the mathematics of fuzzy or cloudy quantities which are not describable in terms of probability distributions. (Dr. Bart Kosko, Fuzzy Thinking, p. 145, quoting from Dr. Zadeh's paper.)
Dr. Zadeh reminds us that biological systems are an order of magnitude more complex than inanimate systems. This is neither to diminish the complexity, nor the conventional explanatory power of math or physics. But when we want to make meaningful statements about these biological systems with their attendant complexity, what can be said with the traditional standard of precision is limited. For instance, when we think about the creative act of writing, we realize that the invention and expression of complex ideas has a reality that goes far beyond the ability to electrically map the brain.
We may be able to identify the electrical patterns of the brain with a high level of precision and still know very little about the nature of the ideas. The nature of ideas would certainly include elements such as invention, expression, relation and evaluation. Any satisfactory explanation of these elements must go beyond mapping electrical patterns. We need to grasp very nuanced meanings that come from a nuanced understanding of context. Yet, this order of magnitude of complexity only addresses the individual writer. When we consider some of the other relevant social variables connected with writing - editors, agents, publishers and the reading public - the magnitude of complexity of interaction of these living systems is staggering. To gain insight into how a particular author invents, expresses and interacts with others, we need to know about the particular subtleties that are at work in that particular case. This example is one illustration of Zadeh's notion that biological systems have a level of complexity that far exceeds that of inanimate systems.
Dr. Zadeh makes an important distinction between 'animate' and 'inanimate' systems. Fixed data sets typify 'inanimate' systems - systems which do not change with time and whose members are precisely defined. The probability distributions of conventional mathematics work extremely well when dealing with these fixed data sets. The mathematics of probability processes this data to produce familiar measures, such as mean and standard deviation, that accurately characterize the features of such a set. The analysis of these static and fixed data sets is a perfect way to discuss the general features of a fixed and static population - an 'inanimate' system.
In contrast, when we consider the dynamic nature of biological systems, the precisely defined data set, with its attendant conventional
mathematics of probability, is inappropriate. The appropriate approach must address the fact that an organism, by definition, is in a
continually changing state that is neither fixed, nor can it be precisely defined. Living systems move through time and space, constantly monitoring and adjusting in order to enhance the possibility of survival (as well as the achievement of any other higher order goals). It is
this aspect of biological systems that scientists hope to capture through an analysis of animate systems. The new mathematics
of animate systems must somehow articulate this qualitative difference between the static and dynamic features of existence.
There is another way of looking at data that better reflects the dynamic nature of living systems. This new method requires living data sets, which we choose to call data streams. The mathematics of living data streams must somehow address the information
flow of an inherently growing data set. Only in this manner will the new mathematics be able to describe the complex ongoing
relationship between organism and environment. Any animate system that hopes to simulate Life's dynamic nature must incorporate
this new mathematics of living data streams.

Data Stream Mathematics must address Life's Immediacy


This new mathematics that addresses the data streams of living systems must go beyond traditional methods of characterizing data. Conventional probability computes a single mean or standard deviation to characterize an entire set of data, where each set member is weighted equally. The new mathematics of living data streams should not weight all members of the data stream equally. The most relevant information regarding the well-being of an organism is generally the most current information. Although past experience certainly has relevance, more recent changes in conditions are likely to have an immediate impact upon the well-being of the organism. For instance, the average temperature for the day might be 65 degrees, but the more relevant information to an organism is that the current temperature is 32 degrees. A mathematics that weights recent input more highly is required to make meaning out of living data streams.
The nature of mathematical meaning as applied to this new interpretation of data streams is going to be qualitatively different from the nature of mathematical meaning for a fixed data set. Traditional probability distributions are an excellent articulation of the general meaning of fixed data sets. But this excellent articulation of overall averages inherently undervalues whatever significance may lie in the pattern of immediately preceding events. Therefore the power of the conventional mathematics breaks down when it is not addressing a fixed set with equivalent members. In fact, the only way that conventional mathematics can address a living data stream is by adding the new data to the existing fixed set, and then treating the new set as an enlarged, yet fixed set, where all members are equivalent. This traditional approach trivializes the significance of particular moments in a dynamic data stream.
Probability theory inherently ignores the immediacy of events. Probability is a big picture specialist. It characterizes the nature of the entire set, and therefore undervalues the significance of the most recent environmental input. In contrast, living systems typically find it pragmatic to weight recent experience more heavily (see above), rather than taking all the members of the set equally into account when making computations. In short, the probability distributions of conventional mathematics are appropriate for dealing with fixed and permanent data sets (where every member is weighted equally). However, traditional data set mathematics is inappropriate for dealing with dynamic ongoing data streams (where the members are weighted in proportion to their proximity to the most recent data point). As such, a brand new type of mathematics is needed - one that specializes in living data streams. This data stream mathematics must somehow take Life's immediacy into account by weighting the most recent points more heavily.
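To make this recency weighting concrete, here is a minimal sketch of one familiar way to weight newer readings more heavily: an exponentially fading running average. It is only an illustration of the requirement, not the Living Algorithm itself; the function name, the decay constant, and the temperature readings are assumptions invented for the example.

```python
def recency_weighted_average(readings, decay=3.0):
    """Running average in which each new reading counts more than older ones.

    Every update moves the running value a fraction (1/decay) of the way
    toward the newest reading, so the influence of older readings fades
    geometrically instead of being weighted equally.
    """
    average = None
    for x in readings:
        if average is None:
            average = float(x)              # first reading seeds the average
        else:
            average += (x - average) / decay
    return average

# A day that starts mild and turns freezing.
temperatures = [65, 64, 62, 58, 50, 42, 36, 33, 32, 32]

print(sum(temperatures) / len(temperatures))        # plain mean: 47.4, the big picture
print(recency_weighted_average(temperatures, 3.0))  # about 36.6, tracking the recent cold
```

The plain mean describes the whole day equally well and equally poorly; the recency-weighted value drifts toward the freezing readings that matter to the organism right now.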

Data Stream Mathematics must include ongoing Predictive Descriptors


A mathematics that addresses immediacy will inevitably produce new measures that focus our attention on a particular moment, or series of moments, in the data stream. Because this mathematics applies to data streams, not data sets, these new measures must be qualitatively different from traditional measures. To address the importance of recent events, these measures must describe the nature of the most current moments. These measures sacrifice big picture averages in order to more accurately describe the momentum of the moment. Probability inherently sacrifices the uniqueness of a particular moment by incorporating the individual data into an overall average of the entire data set. Traditional measures, such as the Standard Deviation and the Mean Average, are therefore inadequate descriptors of a living data stream. Rather than measures that describe the entire population, Data Stream Mathematics requires descriptive measures that focus upon the immediacy of the moment.
To be useful to the organism, these descriptive measures must include a predictive component. Probability's descriptive measures (the mean average and the Standard Deviation) allow scientists to make well-defined predictions regarding general populations. For instance, Probability theory can predict the behavior of billions of subatomic particles with an amazingly high degree of precision. Similarly, the data stream descriptors must allow us to make meaningful predictions about the next point in the stream. Making accurate descriptions about the most recent moments in the data stream enables the organism to make probable statements about the future. Probable statements provide timely information to the organism. Absent this information about the nature of a particular moment(s) in time, the organism would be flying blind as it confronted the environmental data stream. In essence, these probable statements serve two functions: 1) to predict environmental behavior with greater accuracy, and 2) to utilize this information in determining a more appropriate response. Any data stream mathematics that hopes to simulate living systems must include descriptors of particular moments in time that also allow us to make useful predictions about the next point in the stream.
The predictive statements derived from these desired data stream measures are, however, likely to look fuzzier or cloudier than the predictive statements derived from traditional data set measures. The meaning is likely to be 'fuzzy or cloudy' in that the predictions will be suggestive rather than definitive. Practical considerations limit the predictive accuracy of the data stream's ongoing measures. The predictive statements of Probability mathematics are going to be definitive because they are applied to fixed data sets, which, by definition, never change. The predictive statements of data stream mathematics are likely to be fuzzier or suggestive because they are applied to a living data stream, which, by definition, possesses the capacity for constant change.
The new data stream mathematics will not satisfy the traditional predictive rigor demanded by Probability. Yet, this new mathematics will complement traditional approaches by providing more powerful predictors about the immediate behavior of the data stream. The predictive power provided by these data stream measures may well be more significant than overall statements about a growing (yet fixed) data set. A mathematics that weights the immediacy of moment(s) can be a more useful predictor than traditional Probability mathematics (where all members of a fixed set are weighted equally). This new mathematical meaning may not fulfill the criteria for predictive precision demanded by the mathematics of probability; but what it lacks in predictive precision, it more than makes up for by focusing upon the relevance of more recent events.
Are these data stream predictors relevant to living systems?
Clearly there are times when an organism benefits from a heightened awareness of the immediate environment. Living systems are often required to make instant responses to ongoing environmental input - in essence, data stream(s). The changing conditions inherent in the data streams of living systems often require an urgent response. The potential urgency of any reaction requires a flexibility of interpretation and response that is sensitive to the momentum of the moment, or series of moments. There is particular relevance to the organism's ability to predict the probable momentum of an ongoing series of experiences. Probability's preoccupation with the general features of a fixed data set fails to capture the momentum of recent events. Any Data Stream Mathematics that hopes to simulate this feature of living systems must somehow provide predictive descriptors that characterize the probable momentum of the moment. As we shall see, this characterization should also reveal the emerging pattern(s) in a series of moments in the life of an organism.
This analysis of two complementary mathematical systems raises some significant questions. What level of precision is reasonable to expect from the predictive measures of data stream mathematics? Does the distinction between data sets and data streams suggest the possibility of two different standards of precision when analyzing data? Can we tolerate two different standards of precision when analyzing data? If the level of predictive accuracy for data stream mathematics is viewed as substandard by conventional mathematics, can it still be useful? Is accuracy that is suggestive, yet not definitive, a useful predictor? For some preliminary answers to these questions, read on.

Precision & Relevance Incompatible


Conventional mathematics may frown skeptically when confronted with the suggestion that not all significant patterns in Life can be characterized by the precise standards of Probability. However, this presumed imprecision of data stream mathematics turns out to be an asset, not a liability.
In terms of the organism's predictive powers, precision and meaning are inversely proportional - the more precision, the less meaning, and vice versa. This notion may seem counterintuitive, yet in his study of system theory the aforementioned Dr. Zadeh turns this concept into a principle - the principle of incompatibility:
[Zadeh] saw that as the system got more complex, precise statements had less meaning. He later called this the principle of incompatibility: Precision up, relevance down. (Kosko, Fuzzy Thinking, p. 145, 1993)
In 1972, Zadeh articulated the principle even more clearly. Kosko quotes Zadeh:
"As the complexity of a system increases, our ability to make precise and significant statements about its behavior diminishe s until a
threshold is reached beyond which precision and significance (or relevance) become almost mutually exclusive characteristics. A
The Mathematics of Living Systems Page 5

threshold is reached beyond which precision and significance (or relevance) become almost mutually exclusive characteristics. A
corollary principle may be stated succinctly as, "The closer one looks at a real-world problem, the fuzzier becomes the solution." (Fuzzy
Thinking, p. 148, 1993)
If it is true, as Dr. Zadeh argues, that real-world problems require fuzzy solutions, data stream mathematics may provide a method to explore this fuzziness.
The concept behind Zadeh's Principle of Incompatibility helps explain why the traditional laws of Probability find Life's Immediacy perplexing. Probability is far more comfortable dealing with what is familiar - his specialty. He feels most at ease with fixed, unchanging data sets where all members are functionally equivalent. Perhaps his desire for the comfortable, yet rigid, precision of conventional mathematics presents an insurmountable challenge to understanding the complexity of Life's immediate meaning. For Life to have her spontaneous immediacy appreciated, she may have to search elsewhere for a mathematical partner.
Perhaps understanding the immediacy of the moment requires a partner that relates better to a data stream. While this tradeoff sacrifices the comfortable predictability of the more traditional relationship, it offers her a freshness that comes from more accurately understanding the meaning of the moment(s) - her most subtle nature. As we shall see, the suggestive predictors of Data Stream Mathematics with their relative imprecision are an ideal match for characterizing her meaning of the moment - Life's Immediacy.

Summary, Questions & Links


The esteemed Dr. Zadeh makes the claim that conventional mathematics - with its precisely defined probability measures - is inadequate for coping with the analysis of biological systems. He further states that a new mathematics of fuzzy or cloudy quantities is required to deal effectively with such systems. We suggest that a productive step in this direction is the study of living data streams, rather than fixed data sets. Living data streams reflect life's ongoing and immediate nature. A mathematics of living data streams may uniquely capture these characteristic aspects of biological systems.
This new approach, however, has strict requirements. Is it reasonable to expect that a mathematics of data streams will satisfy all of the criteria discussed above? Is it reasonable to expect this method to be: 1) current, 2) self-reflective of immediately preceding experience, 3) responsive to pattern, 4) sensitive to any change in context, and 5) pragmatically predictive? What metaphor, other than mathematics, is capable of systematically relating all these variables of living systems? What will this new mathematical metaphor look like? And if a mathematical metaphor can effectively relate these criteria, could this metaphor actually be a mechanism that living systems employ to process data streams? Is mathematics the information processing language of living systems?


Living Algorithm's Predictive Cloud


29 November 2015
08:32 AM

Fuzzy Set Mathematics doesn't address Life's Data Streams



The prior article developed the notion that living systems must extract meaning from ongoing data streams to survive. Because of the quantitative nature of data streams, we suggest that this meaning is mathematical in nature. This mathematical meaning has a few crucial features. The mathematics must address the immediacy of living systems as well as provide predictive descriptors for each moment. As yet, traditional mathematical systems haven't been able to meet this challenge.
The esteemed Dr. Zadeh, the father of Fuzzy Logic, recognizes this deficiency and offers his own mathematical system as a solution. Let's see how successful this approach is. To set the context, our exploration begins by revisiting a previously cited quotation from Dr. Zadeh:
(Due to) the fundamental inadequacy of the conventional mathematics - the mathematics of precisely-defined points, functions, sets, probability measures, etc. - for coping with the analysis of biological systems ... we need a radically different kind of mathematics, the mathematics of fuzzy or cloudy quantities which are not describable in terms of probability distributions. (Dr. Bart Kosko, Fuzzy Thinking, p. 145, quoting from Dr. Zadeh's paper.)
Inspired by this insight, Dr. Zadeh went on to formulate the concept of fuzzy sets. His followers turned this notion into the mathematics of
fuzzy logic. Engineers have successfully applied the insights of fuzzy logic to significant real-life problems, such as how to stop bullet
trains smoothly. Cognitive scientists have also employed the insights from fuzzy logic to simulate the neural networks of the brain. In
essence, fuzzy logic successfully introduces the both-and approach to data sets, a complement to the either-or approach of conventional
mathematics.
Zadeh's call for a radically different kind of mathematics implies the need for some new kind of arcane operations or esoteric measures to cope with biological systems - perhaps a biological string theory, or a calculus of living systems, or the quantum mechanics of behavior, or even fuzzy logic. However, with the exception of neural networks, fuzzy logic has not proved to be the new mathematics that is necessary for coping with the analysis of biological systems.
It may be that the quest to discover a radically different kind of mathematics to analyze biological systems should not be limited to the esoteric nature of high-level theoretical mathematics. We believe it would be useful to shift the mathematical focus to the inherent nature of the subject matter at hand - the data stream. Rather than pursue ever-more complex abstractions, we believe that the intelligent application of existing mathematical tools can yield powerful, practical insights into the nature of data streams. This intelligent application of existing mathematical tools must focus on the immediate significance of moments in the data stream.
We have argued previously that the dynamic nature of living systems is best characterized by data streams. We went on to detail the
requirements of a mathematics of data streams that would address Life's immediacy. Hunting for a new set of mathematical abstractions
that would fulfill the necessary data stream requirements would be a difficult, if not hopeless, quest. Data Streams offer an extraordinary
challenge because of the inherently changeable nature of an open living system. Living systems can be extremely sensitive to every
interaction with an environment that includes not only the closed systems of inanimate matter, but also includes interactions with other open
animate systems. The complex web of interactions between living systems and the myriad data streams of existence suggests the arcane
approach of theoretical mathematics is highly impractical.
Rather than a higher and more complex level of abstraction, we are looking for pragmatic and useful tools that can help us think about
living data streams. Our search is for a practical mathematics. We appreciate the power of high-level abstractions. However, we believe
there is a place for the use of existing mathematical tools. The key is to apply these tools with a sensitivity to the unique nature of data
streams. The intelligent application of these tools can reveal insights into the nature of data streams, which are accessible to the informed
reader. These practical insights can provide a pragmatic balance to the rarefied language of theoretical mathematics.

The Quest for an 'Animate' System


We begin with this question: What is the relationship between organism and data stream? In other words, what actually occurs when a
biological system encounters a data stream? It is a compelling working hypothesis that organisms both ingest and digest the ongoing flow
of information. If an organism is merely ingesting data, then the data stream has little, if any, relevance. Common sense tells us that
processing data into some usable form must be a central purpose of an organism. The ability of organisms to monitor and adjust to the
inherently changeable nature of data streams depends upon an effective digestive system. Organisms do more than merely ingest data - they digest data. The question now becomes: How does this digestive system function?
Data Streams best characterize the dynamic nature of living systems. Organisms are in a continual state of interaction with data streams.
What could this process of interaction be, other than a perpetual state of ingesting and digesting the flow of information? The question is
not: How do data streams behave? But rather, we ask: What method does an organism utilize to digest the flow of information? The quest is
to find a method that simulates a system that digests data according to the criteria that we have established for a Data Stream Mathematics
of Living Systems. This method will define an 'animate' system of information processing. The test for this system is simple. How well does
it fulfill the prescribed requirements?
The requirements for the Data Stream Mathematics of Living Systems are straightforward, yet daunting. To be a successful candidate for the position, the mathematics of the 'animate' system must effectively address the immediacy of dynamic living systems. This includes: 1) weighting the elements of the data stream in proportion to their immediacy - a sort of sliding scale that weights the present moment more heavily; 2) providing descriptive measures that relate data points to each other in a manner that is sensitive to pattern recognition; and 3) providing suggestive predictors that serve a pragmatic anticipatory function. These are the requirements that a successful candidate must fulfill to be considered for the position. If the requirements are not fulfilled, the position will be left open.
We would like to recommend a candidate for this long vacant, highly coveted and esteemed position. She's an excellent choice. She is a
simple form of information processing. Her sole function is to digest data streams. Further, her method of information processing generates
an 'animate' system, which fulfills the requirements of data stream mathematics. Her mathematics could be called the mathematics of the
moment, in that she effectively addresses Life's Immediacy. This includes providing a suggestive interpretative mechanism that articulates
pattern. The name of our candidate? You may have guessed it. Drum roll, please ... The Living Algorithm's Info System. The following discussion provides evidence that supports our claim that this animate system, the Living Algorithm System, meets these demanding job criteria and should be considered for the position. If her qualifications interest you, read on.

Living Algorithm fulfills Requirements


The conventional mathematics of Probability, whose specialty is fixed data sets, is unable to capture the immediacy of living systems. This
inability is due to a preoccupation with the average features of the entire set, rather than the unique features of particular moments. As such,
we must reject this applicant for the position. The dynamic nature of living systems requires a mathematics that encompasses the
immediacy of Life's data streams.
To accomplish this feat, this new data stream mathematics must: 1) weight more recent data points more heavily; and 2) provide ongoing predictive descriptors. It is easy to explain how the innate nature of the Living Algorithm System fulfills these two preliminary requirements. The Living Algorithm's sole function is to digest Data Streams. Ongoing raw data enters this mathematical system of information processing. The Living Algorithm Family immediately: 1) weights the most recent data points in proportion to the current moment in the data stream; and 2) transforms this external input into ongoing predictive descriptors. Accordingly, each 'moment' in the data stream has its own predictive descriptors. (For the mathematics behind this verbal description, check out The Living Algorithm.)
The third criterion for the position requires a mathematics that provides suggestive predictors that serve a pragmatic anticipatory function.
We believe that the Living Algorithm's descriptive measures fulfill this difficult requirement. Justifying this claim is the quest of the
remainder of this article.

Living Algorithm System's Predictive Clouds


The Living Algorithm's sole purpose is generating the rates of change (the derivatives) of any data stream. This method of information
digestion entails turning precise data (instants) into an ongoing series of moments. These moments are characterized by their derivatives.
These derivatives reveal the trajectories of each moment by describing the current moment in relation to the preceding moments. Each
derivative has its own unique function. The Living Average (the 1st derivative) describes the relative position of each moment in the data
stream in relation to prior moments. We choose the phrase Living Average to represent a proportional weighting of moments whose impact
decreases over time. The Deviation (the 2nd derivative as a scalar, an undirected quantity) describes the relative range of each moment by
articulating the expected limits of the variation of pattern in the data stream. The Directional (the 2nd derivative as a vector, a directed
quantity) describes the relative tendency of each moment by articulating the expected direction of the momentum of a pattern. As a trio,
these descriptors characterize each individual moment in the data stream in relation to the preceding moments. In contrast, due to a
preoccupation with the general features of fixed sets, Probability actually ignores the existence of these moments in the data stream. A way
of visualizing this trio is shown in the following diagram.

These three descriptors simultaneously provide a prediction that amounts to rough approximations about the next data point: 1) the expected
position (the dot in the center), 2) the range of variation (the circle), and 3) the direction of momentum (the arrow). Accordingly, each of the
Living Algorithm's ongoing derivatives is a descriptor that contains a significant predictive feature. A simple combination of these
predictive averages creates a composite predictive cloud. We choose the term cloud to represent the approximation of the expected features
of the next data point, which in summary, includes position, a range of probable values and recent tendencies of direction.
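To make this trio more tangible, here is a minimal sketch of how such ongoing descriptors might be computed. It is an illustration under stated assumptions, not the Living Algorithm's actual equations: we assume a single Decay Factor drives all three running measures and that each update has the same running-average form as the earlier sketch. The class name, variable names, and sample stream are ours; see the linked Living Algorithm article for the real derivation.

```python
class PredictiveCloud:
    """Illustrative ongoing descriptors for a data stream (one value per moment).

    living_average : expected position of the next point (the dot)
    deviation      : expected range of variation, an undirected scalar (the circle)
    directional    : expected direction of momentum, a signed quantity (the arrow)
    """

    def __init__(self, decay=10.0):
        self.decay = decay          # assumed Decay Factor, D > 1
        self.living_average = 0.0
        self.deviation = 0.0
        self.directional = 0.0

    def digest(self, x):
        error = x - self.living_average                                # surprise of the new point
        self.living_average += error / self.decay                      # position relative to prior moments
        self.deviation += (abs(error) - self.deviation) / self.decay   # expected spread around that position
        self.directional += (error - self.directional) / self.decay    # persistent tendency of change
        return self.living_average, self.deviation, self.directional

# Feed a rising data stream one point at a time; the cloud shifts with each entry.
cloud = PredictiveCloud(decay=8.0)
for value in [10, 11, 13, 16, 20, 25]:
    center, spread, drift = cloud.digest(value)

# A rough, cloudy forecast of the next point: near center + drift, give or take spread.
print(round(center + drift, 2), "+/-", round(spread, 2))
```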
The Living Algorithm generates a trio of ongoing descriptors in response to the ongoing flow of information in the data stream. These
descriptors create predictive clouds. These meaningful composite elements, these predictive clouds, may be the type of predictive tools that
Dr. Zadeh suggested would be necessary for coping with the analysis of biological systems. Dr. Zadeh argued that these predictive tools
would be, of necessity, 'fuzzy or cloudy quantities which are not describable in terms of probability distributions'. Dr. Zadeh pursues a
solution that applies new mathematical abstractions to what he calls fuzzy sets. In contrast, we pursue an approach that applies existing
mathematical tools to the notion of a living data stream.
These predictive statements inherit their cloudy nature from the constant state of evolution inherent in a data stream. The Living Algorithm
System digests data to provide a predictive cloud, whose shape shifts with each new entry. Each new data point represents change; and the
constant possibility of change requires an ongoing approximation of pattern that is central to the responsiveness of living systems. As with
Life, these predictive clouds are context sensitive, constantly evolving via the dynamic input from a living data stream. The urgency of
response typically required of living systems demands context sensitivity. These predictive clouds reflect the immediate nature of living
systems, as they move through time. As such, the ongoing and suggestive nature of the Living Algorithm's predictive cloud is ideal for
describing the changeable and immediate nature of living systems.
After accomplishing her last ordeal, the Living Algorithm has now fulfilled all of the previously stated job requirements. The Living
Algorithm is the ideal candidate for the position of representing the personal and dynamic nature of living systems. Her mastery of data
stream mathematics renders her approach a powerful simulation of an animate system.

Nature of Evidence: Living Algorithm as Life's Information Digestion System?


Let us summarize our findings and then examine some of the intriguing implications. One essential fact about living systems is that they are
in a constant state of digesting data streams. The responsiveness of an organism to a data stream requires some form of information
digestion that serves these notable functions:
1. relates the present moment to the immediately preceding moments,
2. reveals patterns (or the lack thereof) that represent these related moments
3. generates rough, yet practical, predictors about the immediate future.
The Living Algorithms unique process of digesting data streams generates the Predictive Cloud. The Predictive Cloud satisfies the three
notable functions outlined above. Accordingly, the Living Algorithm System mirrors these essential qualities that living systems require and
as such is a compelling model of biological information digestion.
Could the Living Algorithm be more than a model? Could the Living Algorithm model actually be the method by which living systems
process data streams? Does this method of information processing exhibit any noteworthy patterns? If noteworthy patterns do emerge, what
rules are capable of generating these forms? Is it reasonable to assume that patterns must conform to some manner of rule-governed
behavior?
What form might these rules take? What language is being spoken? We sense that the ideal language must be mathematical. What other
language is capable of fulfilling the unique requirements of this challenging job description? Is the mathematics of data streams a language
of living systems?
In order to answer these intriguing questions, we must start by discovering if there are any patterns worth noting. If we discover such
sequences, we can then begin to search for the rules that govern them. We can then ask: what sort of language can express the rules that are
required to generate these noteworthy patterns? And finally: are humans, specifically, or living systems, in general, subject to the grammar
of this language?
Let's take a stab at some preliminary answers. Does the Living Algorithm exhibit any noteworthy patterns? Both the Pulse of Attention and
the Triple Pulse are innate patterns of the Living Algorithm's method of information processing. These mathematical patterns obey some
distinct rules, as discussed in Triple Pulse Results. Is there any evidence that living systems are subject in any way to these rules? In the
prior article stream, we examined multiple examples of how the mathematical rules of the Triple Pulse sync up with multiple
experimentally verified sleep-related phenomena. Humans, at least, seem to be subject to the rules of the Triple Pulse. Is all of this evidence
coincidental? Are there other factors at work? Is there a confounding variable that is waiting to be discovered? Or do these correspondences
indicate that we might employ the Living Algorithm to digest information?
If the Living Algorithm is really one of the ways in which living systems digest data streams, the Living Algorithm must have evolutionary
potentials as well. Why else would this computational ability be passed on from generation to generation? For a discussion of these issues, check out the next article in the stream - The Living Algorithm's Evolutionary Potentials.


Living Algorithm's Evolutionary Potentials


29 November 2015
08:34 AM

Could the Living Algorithm possess Evolutionary Potentials?



Living Algorithm?
A special equation whose sole function is digesting data streams.
The Living Algorithm's digestive process provides the rates of change (derivatives) of any data stream. These metrics/measures contain
meaningful information that living systems could easily employ to fulfill potentials, such as survival.
Living Algorithm System?
A mathematical system based in the Living Algorithm's method of digesting data streams.
In the prior monograph, the Triple Pulse of Attention, we saw that the mathematical behavior of the Living Algorithm System exhibited patterns of correspondence with many aspects of human behavior. Specifically, the Living Algorithm's Triple Pulse paralleled many sleep-related phenomena. This intriguing synergy between math and scientific 'fact' led us to ask the Why question. Why does the linkage exist? Is there a conceptual model that could help explain this math/data synergy?

Life and the Living Algorithm are compatible in many ways. As such, the Living Algorithm is the ideal type of equation to model living
systems. Further, the Living Algorithm has many features that are useful to Life. Taking this line of reasoning a step further, we ask the
question: could it be that the Living Algorithm doesn't just model Life, but that living systems actually employ the Living
Algorithm's algorithm to digest sensory data streams? In other words, could the Living Algorithm be Life's computational tool?

Is there any evidence that Life employs the Living Algorithm to digest data streams?
The initial article in this monograph developed the notion that living systems require a Data Stream Mathematics that provides ongoing, up-to-date descriptions of a flow of environmental information. Life needs these descriptors to approximate the future. These approximations enable living systems to make the necessary adjustments to maximize the chances of fulfilling potentials, including survival. This ability to approximate the future applies to a wide range of behaviors - everything from the regulation of hormonal secretions to the ability to capture prey or escape from predators.
The prior article, The Living Algorithm System, argued that the Living Algorithm's Predictive Cloud provides viable estimates about future
performance. As mentioned, Life requires a mathematical system that provides these future estimates. In this way, the Living Algorithm
System fulfills this particular requirement for a mathematics of living systems. The existence of these talents provides preliminary support
for the notion that the Living Algorithm could be the method by which living systems digest data streams.

If the Living Algorithm is really one of the ways in which living systems digest data streams, could the Living Algorithm have evolutionary
potentials as well? Why else would this computational ability be passed on from generation to generation? If it is indeed a computational tool
of living systems, the Living Algorithm should also provide an essential mathematical backdrop that is crucial for the evo-emergence of
many of Life's complex features.
The Living Algorithm's Predictive Cloud is the collection of derivatives (rates of change) that surround each data point - each moment. This
feature is of particular significance because the Predictive Cloud consists of predictive descriptors. In the prior article, we saw that these
predictive descriptors could be very useful to Life on a moment-to-moment level. Could these predictions concerning environmental
behavior confer an evolutionary advantage as well? Is it possible that knowledge of the Living Algorithm's Predictive Cloud could further
the chances of survival for the myriad biological forms? Does the Predictive Cloud provide an indication of the evolutionary potentials of the
Living Algorithm System?

We don't know. You must read on to find out.

Living Algorithm's Predictive Descriptors as applied to the Environment


Why might the Living Algorithm's predictive descriptors provide an evolutionary advantage?
The Living Algorithm's predictive power is based upon the ongoing contextual features of the data stream. The content-based approach (the
raw data combined with memory) is certainly the simplest system. It requires no storage memory and has no computational requirements.
However, because the ongoing instantaneous nature of the information is devoid of context, employing the most recent instant to make
predictions about the future is like shooting in the dark. The likelihood of a hit is so small, as to approach the improbable.
The Living Algorithm's digestion process provides crucial knowledge about the features of any data stream. This digestion process generates
ongoing, up-to-date rates of change (derivatives) that provide context. These contextual features provide predictive powers that are far
superior to that provided by sheer content alone (the raw data combined with memory).

The most basic of these features is the trio of central measures referred to as the Predictive Cloud. The Cloud's predictive power has many
uses. On the most basic level the Cloud provides information as to probable location, range and direction of the prey/predator. The Living
Average indicates the most probable location for the next piece of data; the Deviation, the range of variation; and the Directional, the
tendency and probable direction of the data stream. This trio of central measures provides incredible predictive power regarding the next data
point.

Could knowledge of the Predictive Cloud's metrics/measures regarding the ongoing flow of environmental data provide an essential
evolutionary advantage? Could an organism, whether cell or human being, make more accurate predictions about the future with an
understanding of these ongoing mathematical features of the myriad environmental data streams?
Let's explore some examples of the predictive power of the trio of measures that constitute the Predictive Cloud. An ongoing knowledge of this trio of central measures would provide invaluable information to the prey in terms of probable range, direction, acceleration, and actual location of a moving predator. Vice versa, these measures would provide invaluable information to the predator in terms of the probable
location of an escaping prey. (The dot in the diagram indicates the probable location, the circle: the range, and the arrow: the direction of the
next data point.)
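As a hedged sketch of how this might look in practice, the snippet below applies the same illustrative running-average digestion used earlier to the x and y coordinates of a moving prey, yielding a probable next location (the dot), a range (the circle), and a drift (the arrow). The scenario, the decay value, and the coordinates are invented for illustration; they are not measurements or the author's notation.

```python
def digest(stream, d=5.0):
    """Digest one coordinate of a data stream; return (center, spread, drift)."""
    center = spread = drift = 0.0
    for x in stream:
        error = x - center
        center += error / d                    # probable position (the dot)
        spread += (abs(error) - spread) / d    # probable range of variation (the circle)
        drift += (error - drift) / d           # probable direction of motion (the arrow)
    return center, spread, drift

# Observed positions of a prey moving roughly northeast.
path = [(0, 0), (1, 1), (2, 1), (3, 2), (4, 3), (6, 4)]
cx, sx, dx = digest([p[0] for p in path])
cy, sy, dy = digest([p[1] for p in path])

# The predator aims near the probable next location, within the expected range.
print("aim near", (round(cx + dx, 1), round(cy + dy, 1)), "within", round(max(sx, sy), 1))
```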

An organism could make a more efficient and effective response to environmental input with a knowledge of probable outcome. For
example, the ability to better predict location, range and direction of motion would allow the predator/prey to capture/escape more
frequently. It seems safe to say that the better the organism's predictions are, the greater the chance of survival. This would apply to any
organism. In short, the knowledge of the ongoing contextual features of any data stream enables any organism to make conscious,
subconscious or hard-wired choices that further the chances of survival.
The knowledge of probable outcome supplied by the Predictive Cloud also enables the organism to conserve energy. Instead of wasting
energy in the unguided attempt to procure food or sexual partners, the organism would only expend valuable energy when the Predictive
Cloud indicates a greater chance of success. Of course, energy conservation is a key evolutionary talent.

In addition to physical capabilities such as size and strength, it seems evident that the predator/prey evolutionary arms race would have to
include the computational ability to make probabilistic predictions about the future. Further, the refinement of this essentially mathematical skill has no end. While strength and size have limits imposed by physical requirements, neural development is virtually unlimited, as witnessed by these words. The continuous refinement of this computing advantage, whether through experience, evolution, or emergence, would enable the organism to maximize the impact of the response, while minimizing energy expenditure - an essential evolutionary ability. Of course, this refinement of computational abilities could apply to the Living Algorithm's multitude of potentials.

Predictive Cloud: Computational Backdrop for Expectation-based Emotions?


As mentioned, an ongoing knowledge of the Living Algorithm's Predictive Cloud, the aforementioned trio of measures, could be employed
as an invaluable predictive tool. On more complex levels, these same measures could easily supply the essential computational backdrop for
the development of our emotions. Let us offer some cursory remarks in this regard. The Cloud provides information that could enable an
organism to anticipate and prepare for the future. Anticipation morphs into expectation.


In brief, the Cloud's measures of data stream change are emotionally charged because they determine expectations concerning the future. The investment of emotion into information, whether memories or data, has an evolutionary purpose - to reinforce memory.

This emotional content renders the information easier to remember. This is not mere speculation. Cognitive studies have shown that memory and emotions are linked. Information's meaning is invested with emotion because it is relevant to our existence. This is why a random set of numbers, which carries no such relevance, is difficult to remember.

It's clearly evident that the Living Algorithm's Predictive Cloud could provide an obvious evolutionary advantage to living systems. The
accuracy of future estimates is increased via the application of a simple and replicable algorithm. The Cloud's estimates of future
performance could be employed to predict environmental behavior. Better predictions increase the efficiency and effectiveness of our energy
usage. This conservation of energy certainly provides an evolutionary advantage. Further, the Cloud's trio of predictors could easily generate
the expectations that are the base of many emotions. Emotions have an evolutionary purpose, as they are associated with heightened
retention and recall associated with memory.

This discussion suggests that it is in Life's best evolutionary interests to have knowledge of the 3 ongoing and up-to-date measures that the
Living Algorithm provides. But to have access to this predictive power, Life must employ the Living Algorithm to digest data streams.

Living Algorithm provides Sense of Time that Life requires.


Besides predictive power, the Living Algorithm's method of digesting information also supplies a sense of the passage of time. The Living
Algorithm's repetitive/iterative process relates past data to the current data, with the present being weighted the most heavily. This relating
process, which merges the past with the present, confers a sense of the passage of time. Similarly, a cartoon consists of a series of related
images.

The sense of time passing is important to living systems for multiple reasons. A primary reason is that the flow of digested sensory
information only makes sense over time. Time duration is required to derive meaning from individual sensations. Isolated sensory input
makes no sense by itself. For instance, an isolated sound without temporal context is neither music nor a word. Even a picture takes time to
digest, no matter how brief. A sustained image over time is required to identify objects. The sense of smell, supposedly the first sense,
requires a duration of some kind to differentiate the potentially random noise of a single scent from the organized meaning of a sustained
fragrance.
A sense of time is required to experience the meaning of a signal. If an organism existed in the state of sheer immediacy, it would automatically respond to environmental stimuli tit-for-tat - just as matter does. But to make any kind of sense out of a sensory message, the organism requires an elemental sense of the passage of time. The organism must be able to pay attention to the sensory translation for a sufficient length of time to determine if the message indicates food, foe, or sexual partner. Otherwise the raw sensory information is just random garble. It is evident that the ability to sense the passage of time is essential if we are to experience the information behind sensory input.

Further, when an organism must make choices based upon sensory input to maximize the chances of survival, a sense of time is required to
even begin comparing alternatives. It seems that a sense of time is not just an evolutionary talent, but a requisite talent for all living systems.
For an organism to both have a sense of time and make educated guesses about the future, it seems that living systems must have emerged
with some sort of computational talent. This computational talent could be employed to digest the sensory data streams that are in turn
derived from the continuous flow of environmental information. The Living Algorithm's method of digesting data streams provides both
future estimates and a sense of time.

If the ability to digest information and transform it into a meaningful form is indeed an essential characteristic feature of living systems,
could the Living Algorithm and Life have emerged from the primordial slime together?

How Living Algorithm mathematics supplies a sense of time.


How does the Living Algorithm provide a sense of time?

The Living Algorithm's digestion process generates a sense of time by merging and relating the present moment with past moments.
How is this blending of past and present accomplished?
The impact of each data byte decays over time. This process is illustrated in the graph at right. Each color swatch represents the impact of an
individual data bit as it decays over time. The x-axis represents 180 repetitions of the Living Algorithm's mathematical process. Notice how
each moment includes many colors. This indicates the impact of prior and current data bits upon the current moment.
How is decay incorporated into the mathematical process?
The Living Algorithm's Decay Factor supplies this function. Let's see how.
The senses digest continuous environmental input to transform it into digital form. However, this digital form has no meaning. Each byte of information is isolated from the rest. At this point in the digestion process, the Decay Factor = 1, which means that there is no decay. With no decay there is no relationship between the data points. With no relationship, there is no sense of time. Without a duration of time the sensory output (the translated environmental information) makes no sense. In summary, when the Decay Factor is one (D=1), there is no time, hence no meaning.

To provide a sense of time, hence meaning, the sensory data streams require another level of digestion. The senses transform environmental
information into sensory data streams. The Living Algorithm's digestion process relates the isolated points in the sensory data stream to
create a sense of time. This relating process automatically occurs when the Decay Factor is greater than 1 (D>1). The Living Algorithm's
digestion process generates a sense of time passing, which simultaneously imparts the potential for meaning.

To aid retention, let us summarize this important process. Our senses digest continuous environmental input, transforming it into sensory data streams. The isolated instants of these sensory data streams don't have any inherent meaning, because they are not related to each other in any way, as there is no decay (D=1). The Living Algorithm digests sensory data streams. This digestion process relates the isolated instants by introducing decay (D>1), which generates a sense of time. A sense of time is the essence of meaning.
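
To make the Decay Factor's role concrete, here is a minimal Python sketch. It assumes a common decaying-average form (new average = old average + (new value - old average) / D) as a stand-in for the Living Algorithm's actual formula, which is introduced in the earlier articles of this series.

def living_average(data_stream, D=2.0):
    """Digest a data stream into a single ongoing, decaying average."""
    average = 0.0
    history = []
    for value in data_stream:
        average += (value - average) / D   # fold the present into the digested past
        history.append(average)
    return history

# With D = 1 each new value simply replaces the average: the data points stay
# isolated from one another and no sense of time accumulates.
print(living_average([1, 0, 1, 1, 0], D=1.0))   # echoes the raw stream
# With D > 1 the past decays gradually, so each moment carries a fading trace
# of the prior data bytes.
print(living_average([1, 0, 1, 1, 0], D=2.0))

Under this assumed form, D = 1 reproduces the isolated, inert instants described above, while any D greater than 1 blends past and present into a single evolving quantity.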

This analysis suggests that it is at least a plausible proposition that the Living Algorithm's digestion process could create the sense of time
that is required for meaning. Because living systems must derive meaning from data streams, this lends further support for the notion that
Life employs the Living Algorithm to digest data streams.
As an aside, material systems do not derive meaning from data streams. As such, material systems only deal with information that is inert. A
data stream's information is inert when the Living Algorithm's Decay Factor is one (D = 1). Conversely, a data stream's information is
dynamic when the Decay Factor is greater than 1 (D>1). Living systems require dynamic information because it yields meaning. As such,
this significant difference between material and living systems is inherent to the Living Algorithm's method of digesting data streams.


Living Algorithm's Data Stream Acceleration as a Noise Filter


The prior section illustrated how the Living Algorithm could very well provide the computational backdrop for our sense of time. A sense of
time is required to derive meaning from sensory input and enable us to compare alternatives. Besides providing the sense of time that imparts
meaning to our senses, the Living Algorithm also calculates a data stream's acceleration. In fact, two of the Predictive Cloud's measures are
accelerations. Besides providing plausible estimates concerning future performance, knowledge of a data stream's acceleration could enable
an organism to filter out random data streams as meaningless noise. Put another way, data stream acceleration enables an organism to
differentiate random noise from a meaningful signal.

The ability to differentiate a random from an organized signal is due to a simple mathematical fact. The random data streams associated with
background noise possess an innate and stable velocity, but no acceleration. Conversely, an organized data stream (a string of relatively
stable values consistent with ordered environmental input) has a distinct acceleration.

The graph at right exhibits this distinct difference between a random and an organized data stream. The big red curve represents the acceleration of an ordered data stream, the classic Pulse of Attention (120 ones). The erratic green curve represents the acceleration of a random stream of zeros and ones. It is immediately apparent that the acceleration of the organized data stream overshadows (rising 3 times higher than) the acceleration of a random data stream.
Why is identifying random streams a significant talent? After the senses translate continuous environmental information into sensory data
streams, it is essential to first pare out superfluous data streams from consideration. Differentiating random from organized signals is the
initial step in the process. This filtering process prevents information overload. The ability to identify and ignore random data streams
eliminates an abundance of environmental information from consideration. Minimizing the number of data streams under consideration
maximizes the speed and efficiency of response, which of course conserves energy.
The focus upon data stream acceleration as a way of filtering out random signals has other advantages as well. Paying attention to data
stream acceleration enables frogs to conserve their energy by shooting their tongues only at bugs, rather than plants. Insects move erratically, and hence with more data stream acceleration, than plants do. On more complex levels, focusing upon data stream acceleration allows complex life forms to determine changes in their environment. Perceiving environmental changes, whether auditory, visual, olfactory or tactile, is essential for any organism that must detect an approaching predator, prey, or sexual encounter. Organisms with this sense must somehow have the ability to perform calculations that differentiate the random noise of the background environment from the significant accelerations of predator and prey. The simple Living Algorithm supplies the ability to perform these calculations
relatively effortlessly.
It seems that the Living Algorithm-derived random data stream filter could be employed to diminish the amount of incoming data, hence
prevent information overload. This same filter could also be employed to identify environmental changes. Both of these computational
talents provide an evolutionary advantage.
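
The following rough Python sketch illustrates the filtering idea. It is only an approximation: a stream's 'velocity' is treated as the change in its decaying average, and a smoothed trend of those changes stands in for the acceleration measures of the Predictive Cloud, whose exact definitions (and the Decay Factor behind the graph) belong to the earlier articles.

import random

def trend_trace(stream, D=10.0):
    """Decaying average of a stream plus a smoothed trend of its changes."""
    average = trend = 0.0
    trace = []
    for x in stream:
        new_average = average + (x - average) / D   # decaying average
        change = new_average - average              # moment-to-moment change
        trend += (change - trend) / D               # smoothed recent trend
        average = new_average
        trace.append(round(trend, 4))
    return trace

pulse = [1] * 120                                   # ordered Pulse of Attention
noise = [random.randint(0, 1) for _ in range(120)]  # random zeros and ones

# In the ordered stream the early changes all push the same way, so the trend
# swells into one coherent pulse before settling back toward zero. In the
# random stream positive and negative changes interleave, so the trend mostly
# wobbles about zero. How sharply the two separate depends on D and on the
# exact acceleration measure used.
print(trend_trace(pulse)[:20])
print(trend_trace(noise)[:20])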

Living Algorithm System has the streamlined operations that Evolution prefers.
The Living Algorithm System provides predictive capabilities. Further, the Living Algorithm's method of merging/relating the past and the
present generates a sense of the passage of time. Living systems require a sense of time to derive meaning from the sensory data streams.
Finally the Living Algorithm computes data stream acceleration. Knowledge of data stream acceleration could enable an organism to
differentiate random from meaningful signals and identify changes in the environment. Each of these talents provides an evolutionary
advantage.
Besides providing these advantages, the Living Algorithm satisfies evolution's simplicity requirement. Evolutionary processes select for
efficiency and simplicity in order to streamline operations and avoid the corruption of complexity.

The Living Algorithm System satisfies these demands in three ways.


1) There is only one formula or algorithm. The Living Algorithm's pattern of commands can be employed on multiple levels to calculate an endless array of predictive measures. In other words, a simple replication of one algorithm provides an abundant source of predictive data streams (see the sketch after this list).
2) The necessary computations are quite simple, employing only the operations of basic math.
3) The Living Algorithm's measures require very little storage space. Rather than storing exact memories of any kind, only the ongoing, digested measures are stored. The present is continually incorporated into the digested past, with the most recent environmental input having the greatest impact. Rather than a complete motion picture, only the most recent composite quantity is stored: discrete pieces of digested data versus a continuous data flow. Movies or music require far more storage capacity than numbers and letters, as anyone knows who attempts to download CDs and DVDs onto their personal computer. Even static pictures require far less storage space. Due to the contextual nature of the information, storage and retrieval are relatively simultaneous.
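
Here is a small Python sketch of points 1 and 3, under the same assumed decaying-average update as the earlier sketches: one rule, reapplied level after level, with each higher level digesting the output stream of the level below. The actual multi-level scheme is specified elsewhere in the series.

def digest(value, state, D=8.0):
    """One algorithm: fold a new value into an ongoing decaying average."""
    return state + (value - state) / D

levels = [0.0, 0.0, 0.0]               # three chained measures; could be any number
for x in [1, 1, 0, 1, 0, 0, 1, 1, 1, 0]:
    signal = x
    for i in range(len(levels)):
        levels[i] = digest(signal, levels[i])
        signal = levels[i]             # the next level digests this level's output
print(levels)                          # only these digested numbers are ever stored

Note that no raw history is kept anywhere: after each data byte is folded in, the stored state is just the handful of digested measures.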

Besides providing computational talents that are crucial for survival, and hence provide an evolutionary advantage, it seems that the Living Algorithm System also satisfies the requirement of simplicity. The principle of conservation dictates that Life requires a data digestion system whose features are as economical as possible. This simplicity minimizes breakdown, data corruption, and computational and memory requirements. The Living Algorithm's algorithm is simple; computations are basic; and memory requirements are minimal.

Preliminary Comparisons: the Living Algorithm vs. Probability, Physics & Electronics
We've seen many ways in which the Living Algorithm's Information Digestion System could provide essential evolutionary talents to living
systems. How do other methods compare? Let us offer some preliminary remarks in this regard.
As seen, the Living Algorithm satisfies the simplicity requirement for living systems in terms of computation and memory. In contrast, Probability has prohibitive computational and memory requirements. No economy whatsoever. This dooms Probability as a computational tool for living systems. To provide predictive descriptors, Probability requires many complicated formulas and operations, not to mention the necessity of a huge and precise database. Further, Probability only makes predictions about the general features of the set, not the ongoing moments. This method also weights past and present data points equally. While Probability's descriptors provide estimates of the future, these estimates are based upon what was, rather than what is happening now. Besides being more economical in terms of operations and memory requirements, the Living Algorithm System provides more up-to-date predictions about the future than does Probability.
Electronics provides another system that is in the running for the position as Life's computational tool. Electronic data processing is different in many fundamental ways from the type of data digestion provided by the Living Algorithm. The function of electronic data processing is to transmit environmental or internal input as accurately as possible through noise reduction. Shannon, the father of information theory, studied this type of processing regarding the clarity of electronic transmissions, such as radio, television, computers, and spacecraft. Accuracy is of utmost importance in electronic transmissions, as Internet users well know. In this case, the internal codes that are required to ensure accuracy only predict the most usual forms that the message could take, as a type of redundancy testing. This type of data processing requires standards of expectations to establish redundancy patterns. However, this method does not provide any predictive abilities: no estimates concerning future performance and certainly no room to move. Further, electronic information processing does not relate the data. Without a relation between the past and the present, there is no sense of time passing. Hence, electronic information has no meaning. Electronics imparts accuracy; humans impart meaning. And the Living Algorithm provides the type of information digestion to impart that meaning.
Physics provides yet another alternative for determining living behavior. Given the initial conditions, say the Big Bang, and the appropriate
equations, Physicists can precisely predict the behavior of material systems. If living systems have no ability to adjust to external
circumstances, and instead respond automatically to environmental stimuli, then Physics is still in the running for the position as Life's
computational tool. If, however, living systems have the ability to make adjustments that facilitate survival, then they need a mechanism that
will provide predictive powers and a sense of time. If this is the true state of things, then Physics is out of the running, as all operations are
automatic. We will deal with these topics in more depth in the article on Informed Choice.

Summary
The Living Algorithm is an ideal evolutionary tool due to its predictive capacity, minimal memory requirements and ease of use. The Living Algorithm's Predictive Cloud easily characterizes the moment and provides pragmatic estimates about the immediate future. Articulating the
relationship between moments reveals patterns. Moment-to-moment updating discloses changes in these same patterns. Both are crucial
abilities for any organism. The Living Algorithm provides both of these functions. If living systems employ the Living Algorithm to digest
environmental data streams, it seems reasonable to assume that this would impart a huge evolutionary advantage.
Besides this predictive capacity, the Living Algorithm's relating process, whereby past information is related to current information, provides
living systems with a sense of time. A sense of time is a requisite talent for interpreting the sensory input from the environment. Without the
ability to experience this sensory information over time, the environmental input becomes meaningless noise. Without access to meaningful
information, an organism cannot respond effectively to environmental stimuli and perishes. Besides separating living matter from inert
matter, a sense of time provides a serious evolutionary advantage.
Finally, the Living Algorithm's information digestion process also generates the acceleration of any data stream (one of the features of the
Predictive Cloud). Data stream acceleration could easily provide the computations that enable an organism to differentiate a random from an
organized signal. This talent diminishes the possibility of information overload. Eliminating superfluous information from consideration
certainly provides an evolutionary advantage. As a significant side benefit, knowledge of data stream acceleration also enables an organism
to identify environmental change. Identifying change could also signify the need for an appropriate response, another significant
evolutionary advantage. It seems evident that if Life employed the Living Algorithm's digestion process, it could provide a multitude of evolutionary advantages.

To further illustrate the pragmatic nature of the Living Algorithm's Predictive Cloud, the next article in the stream explores a concrete example from the sport of baseball: the batting average. In the process we will see how Probability & the Living Algorithm are Complementary Systems.


The Batting Average: Living Algorithm vs. Probability


29 November 2015
08:36 AM

Probability Digests the Baseball Hitter's Data Stream



In the preceding article, we showed how the Living Algorithm System is the ideal mathematics to deal with Life's data streams. The Living
Algorithm is sensitive to the moment and weights each data point in the stream according to its relation to the present. Further, the Living
Algorithm's predictive cloud also describes the trajectory of the moment's recent trends. This up-to-date information about the moment
provides estimates about the nature of future moments.
To illustrate these concepts, let's explore a concrete example. The baseball player's actions during a game can be characterized by any number of data streams. One of the most basic measures of a player's performance at the plate is the batting average. The batting average is determined by one of these data streams. The rules that generate this data stream are simple. If he gets a hit, he generates a one; and if he doesn't get a hit, he generates a zero. There are several kinds of at bats that are excluded from the data stream that determines the batting average (e.g. walks, batters hit by pitches, and sacrifices).
Probability looks at this flow of information as an ever-growing set and computes an average (the mean), which is appropriately called the
batting average. This statistic has had a significant role to play in the evaluation of the success of any professional hitter. Raw numbers such as hits, home runs and RBIs are also significant measures of success, but the batting average has been a traditional indicator that complements these raw numbers. Baseball players' salaries and fame are based, in part, upon these batting averages.
These batting averages, which describe a player's performance, can also be used to serve a predictive function. The knowledge of a player's batting average is likely to shape the strategy of opposing coaches and pitchers. Owners and general managers also use the batting average to
predict how well the player will do in the following year(s). Bonuses, salaries, and long-term contracts are also likely to be influenced by a
player's batting average.
It is easy to see from this example how descriptive measures of past performance, such as the batting average, are used to predict future
performance. It is equally obvious that these predictive descriptors only provide very rough approximations of future performance. Even
though a baseball player might have a batting average of .333, this does not in any way guarantee that he will continue to bat .333 for the rest
of his season, contract, or career. This obvious lack of guarantee reminds us that the batting average is only a rough approximation of future
performance.
The rough approximation of the future provided by a batting average is valued by those who have a stake in predicting future events. Large
salaries are given because of these rough approximations; huge bets are placed on them; and strategies are formed on these predictive
descriptors called batting averages. It is evident that, despite their relative imprecision, these guesstimates are extremely meaningful to the
world at large.
Let's see what happens when the Living Algorithm processes this data stream: not as an extended fixed set, but as an ongoing stream.

Living Algorithm digests the Baseball Player's Data Stream of 'at bats'
This analysis might suggest that the Living Algorithm is nothing more than a subset of Probability. In the ensuing discussion, we hope to
illustrate that the Living Algorithm is a unique approach to data analysis. Rather than being a subset, the Living Algorithm appears to be a
valuable complementary approach. The Living Algorithm digests the exact same data stream: the baseball player's 'at bats'. From this
data stream, the Living Algorithm generates a predictive cloud, consisting of the previously mentioned trio of descriptors. This
predictive cloud describes the context of each moment in the player's career. Instead of characterizing the entire stream of 'at
bats' as an enlarged fixed set, the Living Algorithm characterizes the changing pattern that results from a constant focus on the
most recent at bats. While Probability weights each data point (each 'at bat') equally, the Living Algorithm assigns the greatest
weight to the most recent 'at bat' (data point), and scales the rest of the 'at bats' in descending order from the present.
Accordingly, the Living Algorithm's predictive cloud is context sensitive, adjusting to recent 'at bats' and providing predictive
information about the next 'at bat'.
Because of this context sensitivity, the Living Algorithm's predictive cloud provides up-to-date, relevant information as to the character of the next 'at bat'. The trio provides information about the hitter's current state of affairs: the position, range of variation, and direction of the momentum of the batter's hitting data stream. This information could lead to the following scenario. The hitter's batting average for the year is .333 (Probability's mean average). Complementing this knowledge are the insights provided by the Living Algorithm: his current weighted average over his most recent at bats is .375, with a range of .20 and a positive direction of .10. The Living Algorithm's predictive cloud indicates that the batter is 'hot' right now. His recent weighted average of .375 exceeds his overall batting average of .333. In addition, he has been very consistent in recent at bats, as indicated by the tight range of variation (.20). Furthermore, his recent batting data stream has a positive momentum (+.10). These descriptors of the current state of affairs provide rough approximations of the immediate future. This information can be exceedingly relevant to the opposing pitcher and his coaching staff.
In contrast, a hitter might have the same batting average for the year of .333, but the predictive cloud could indicate his weighted average is .285, with a range of .80 and a negative direction of .20. This indicates that the batter is currently 'cold'. His weighted average of .285 is less than his overall average of .333. His performance is erratic, as indicated by the large range of variation (.80). Furthermore, his current hitting momentum is negative (-.20). In this case, the Living Algorithm's predictive cloud would provide data that could lead to a very different set of strategies for the opposing pitcher and his coaching staff.
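
A hedged Python sketch of such a scenario follows. The three running measures here are stand-ins (a decaying average of the at bats, of their absolute deviations, and of their signed deviations); the Predictive Cloud's actual definitions appear in the earlier articles, so treat the output only as an illustration of how the trio complements the season-long mean.

def predictive_cloud(at_bats, D=10.0):
    average = spread = direction = 0.0
    for hit in at_bats:                          # each at bat: 1 = hit, 0 = out
        error = hit - average
        direction += (error - direction) / D     # momentum of the recent trend
        spread += (abs(error) - spread) / D      # recent range of variation
        average += error / D                     # recent weighted average
    return round(average, 3), round(spread, 3), round(direction, 3)

season = [1, 0, 0, 1, 0, 1, 0, 0] * 15 + [1, 1, 0, 1, 1, 1, 0, 1]   # a hot finish
print(round(sum(season) / len(season), 3))       # Probability's batting average
print(predictive_cloud(season))                  # the context-sensitive trio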
In both scenarios, the batting average generated by Probability remains the same. Yet, these two different hypothetical moments in the
hitter's data stream suggest that there are two very different patterns at work. The Living Algorithm reveals these diverse patterns by
providing unique information about the data stream of a hitter that is both timely and context sensitive. Could it be that what is relevant to the
data stream of a baseball player may also be relevant to the data streams of other living systems?
The use of Probability's mean average, the famous batting average, is certainly a better way to characterize the player's entire season than the
use of the Living Algorithm. Probability's general averages are adequate, providing a fairly accurate summation of annual talent. This
information is essential when determining annual awards (MVP) and the next year's rewards (salaries). However, it is equally certain that the
batting average for the entire season does not provide up-to-date information as to the hitter's status at the current time. For those who have a
stake in the current game, these general averages merely provide a diluted reflection of the player's present status. In contrast, the Living
Algorithm's predictive clouds provide up-to-date information that is extremely relevant to the manager, the pitcher, and even the betting
community. On the other hand, this up-to-date information, while relevant to the next game, loses its potency when applied to the entire year.
This example from America's game beautifully illustrates how these two approaches to data analysis complement each other. Probability's
measures accurately characterize the fixed data set of the year, while the Living Algorithm's measures accurately characterize each baseball
moment by analyzing the dynamic data stream of at bats.

10-day Average vs. the Predictive Cloud


Parenthetically, we would like to note that statistics are not the only way an effective coach evaluates the momentum of player performance. One of the intangible qualities of a talented coach is the ability to recognize when players are hot and when they are not. A keen intuitive sensitivity regarding this changeable momentum can be a potent skill in an effective manager's toolbox. Beyond intuition, however, what else does the baseball community do to address the notion of performance momentum? Coaches do rely on statistics to help inform their decisions and to provide a reality check for their intuitive insights. Timely information may even have the capacity to stimulate intuitive insights. Let's examine one typical approach that baseball utilizes to characterize recent performance.
The baseball community clearly recognizes that a ball player's seasonal average or career average is a big picture measure of performance. When the baseball community wants to focus on the momentum of a player's recent plate appearances, it generally takes a snapshot of a recent series of at-bats. The 10-game batting average is typical of this attempt to complement big-picture seasonal statistics with an updated sense of the pattern of recent performances.
The sports community intuitively understands that there is more at work than big picture averages. Their use of statistics, like the 10-day
batting average, is designed to reveal a recent piece of the picture. We applaud their efforts to recognize the importance of the present
moment. In fact, there are obviously times when a feel for the present moment is the most important thing to specific members of the baseball
community. Those whose business is game day strategy are necessarily concerned with the momentum of recent performances. This
contemporary insight is every bit as significant as overall seasonal averages to the baseball community. However, in the analysis that follows, we shall claim that this solution is just a junior version of what the Living Algorithm's predictive cloud offers: the cave man model, as it were.
Let's focus upon the nature of the 10-day average as an object of comparison with the predictive cloud. Essentially, a 10-day average offers a very limited analysis of player performance. First, the 10-day batting average is a single measure of a fixed set, while the Living Algorithm offers a trio of measures. Second, every at bat in a 10-day average is equally weighted, so that what happened 10 days ago is just as important as what happened yesterday. And finally, the 10-day average snapshot requires extensive recordkeeping, in contrast to the Living Algorithm.
Initially, we note that the Living Algorithm's predictive cloud provides a trio of measures, while the 10-day average provides a single measure. The trio consists of: 1) a batting average that reflects recent performance (similar to baseball's 10-day average), 2) an estimate of the probable range of variation around that recent average, and 3) the direction of the momentum of the recent average. This trio of measures provides a three-dimensional perspective on a player's recent performance, as contrasted with the one-dimensional perspective of the 10-day average.
Not only does the Living Algorithm provide 3 measures for the price of 1, but its focus is more sharply attuned to the present moment. When we examine the single measure provided by the 10-day average, we find a second significant area of contrast. The Living Algorithm addresses the concept of a 10-day average in a different manner. While the 10-day average weights each at-bat equally over a 10-game stretch, the Living Algorithm assesses each at-bat on a sliding scale that gives the greatest weight to the most recent at-bat. While the traditional 10-day average considers an at-bat from 10 days ago to be as significant as an at-bat yesterday, the sliding weighted scale considers the most recent at bats to be a better indication of the pattern that incorporates the present moment. In essence, the Living Algorithm's weighted focus provides a constantly updated sense of the pattern of recent player performance in a way that the more generalized focus of the 10-day average does not.
In addition to providing 3 measures whose weighted focus is more attuned to the present moment, the Living Algorithm also provides a more
user-friendly approach. The traditional approach to computing a batting average, regardless of whether it is a seasonal average or a 10-game
snapshot, requires a database that includes each individual at-bat. Each new at-bat in the 10-game snapshot requires constantly adjusting the
members of the set by adding the most recent at-bat to the database and removing the now out-dated at-bat. In contrast, the Living Algorithm
integrates the current at-bat into her updated, ongoing weighted averages: the trio of measures that constitute our predictive cloud. Once integrated, the raw data has served its function, and has no further meaning or importance. The Living Algorithm's simple approach to recordkeeping requires no database, and therefore contrasts favorably with the raw data requirements of the traditional 10-day average.
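
A small Python sketch of the two bookkeeping styles may help. A window of the last ten at-bats stands in for the 10-game snapshot, and the decaying average is the same assumed form used in the earlier sketches; the point is only that one approach must keep a database of recent raw data while the other stores a single evolving number.

from collections import deque

class RollingAverage:
    """10-game style snapshot: every recent at-bat must be kept on file."""
    def __init__(self, window=10):
        self.window = deque(maxlen=window)
    def update(self, at_bat):
        self.window.append(at_bat)       # add the newest, silently drop the oldest
        return sum(self.window) / len(self.window)

class DecayingAverage:
    """Living Algorithm style: one stored number, updated in place."""
    def __init__(self, D=10.0):
        self.average, self.D = 0.0, D
    def update(self, at_bat):
        self.average += (at_bat - self.average) / self.D
        return self.average

rolling, decaying = RollingAverage(), DecayingAverage()
for at_bat in [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0, 1]:
    print(round(rolling.update(at_bat), 3), round(decaying.update(at_bat), 3))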
The 10-day average, while focusing a little more closely upon recent events, is still stuck in Probability's paradigm. This paradigm makes estimates about the future based upon general statements about a fixed set. By viewing the data stream as an extended fixed set, the traditional paradigm ignores important potential information. In contrast, the Living Algorithm is designed to access these storehouses of potential information. In essence, the Living Algorithm digests a stream of data by relating a sequence of numbers to each other in an evolving, dynamic manner.
The Living Algorithm provides a trio of measures to mine this untapped potential information. This evolving trio consists of the following
ongoing descriptors: 1) batting average, 2) a range of variation, and 3) a description of recent trends (momentum). These evolving measures
weight the data stream of at-bats on a sliding scale according to their proximity to the most recent data point. Further, the Living
Algorithm's simple algorithm (procedure) is more user-friendly than the unwieldy use of the 10-day average. Rather than relying
on a database that consists of all relevant at-bats, the Living Algorithm requires only the memory of evolving measures that
characterize the most recent player performance.

Living Algorithm & Probability: Complementary Systems of Analysis


These measures not only describe the characteristics of the entire season or of the dynamic moment, but also provide predictive information. The predictive information contained in either of these measures does not satisfy conventional mathematics' demand for predictive rigor. However, these estimates of future performance are extremely relevant to those making decisions about the next season or about the next game. Probability distributions allow scientists to set the confidence limits that enable them to make precisely defined predictions about fixed sets. These probability distributions, where each known element is weighted equally, are, however, helpless before the dynamic nature of the unknown future. Even though Probability computes the batting average, it can't set the confidence limits for this average at a level that would satisfy traditional standards of predictive rigor. A baseball player's performance is too idiosyncratic to satisfy the population size requirements of traditional Probability mathematics. This problem regarding the idiosyncrasy of living data streams suggests that the batting average may very well serve as a representative example of living systems.
Instead of Probability's well-defined fixed-set predictions, living systems require rough approximations of the ongoing patterns of data
streams. These rough estimates enable a range of interpretation and response to the environment, which is represented by an inherently
changeable data stream. What these descriptive measures lack in predictive rigor, they more than make up for by focusing their predictive
relevance on the next moment of performance. Only the Living Algorithm provides the unique trio of predictive descriptors that enhance the
ability for the flexible interpretation and response that is essential for living systems.
Note: the new perspective of data stream mathematics is meant to broaden the current paradigm. The current paradigm, which has evolved over centuries, has become very adept at the mathematics of fixed data sets. Our new perspective does not find fault with the descriptive and predictive power of this traditional approach. We do, however, suggest that the traditional approach has limits to its explanatory and predictive power. These limits lie in the inability to address the immediate and dynamic nature of the biological world. By addressing these current limitations, a Mathematics of Data Streams appears to be a perfectly necessary complement to the Mathematics of Data Sets. One approach deals with the general and permanent features of fixed populations (fixed sets), while the new approach deals with the immediate features of a dynamic data stream. The unique predictive descriptors of the two approaches (Probability's data sets and the Living Algorithm's data streams) provide a dual perspective that better encompasses the whole of inanimate and animate existence. Similarly, balancing the general/permanent components of fixed set mathematics with the changeable/immediate components of a mathematics of data streams should provide a more complete and deeper insight into the nature of living systems.
Probability and the Data Stream Mathematics of the Living Algorithm System do not merely represent a difference of degree, but rather a difference in kind. To see why these two forms of data analysis are complementary, check out the next article in the series: Mathematics of the Moment (vs. Probability).


Mathematics of the Moment (vs. Probability)


29 November 2015
08:38 AM

Probability's Data Sets vs. Living Algorithm's Data Streams



There might be some who feel that the Living Algorithm is but a subset of Probability. On first glance, the two approaches to data analysis seem exceedingly similar. Both Probability and the Living Algorithm employ averages and deviations to characterize data. Due to these similarities, it would seem that both would follow similar technical guidelines and have a similar purpose. As such, one might consider Living Algorithm mathematics to be just a branch of Probability. But, as we shall see, instead of being Probability's subject, the Living Algorithm rules her own realm. While the two have similar tools, they have mutually exclusive, yet complementary, domains. Each has a unique purpose and field of action.
Each system analyzes data. Probability, however, processes Data Sets, while the Living Algorithm digests Data Streams. Data Sets are fixed in size, and Data Streams are continually growing. More importantly, Probability is limited to providing information about the general features of the entire data set, while the Living Algorithm can only provide information about individual moments in a data stream. In short, the mathematical perspectives have unique fields of action. In fact, each is incapable of perceiving data from the other's perspective. These differences are crucial to how each form of mathematics manifests its abilities.

Although there are some striking similarities between the two, each has measures that are unique to its system. For instance, the Living Algorithm includes the Directional as a member of her Family of Measures. The Directional determines the tendencies of the data stream and gives birth to the Liminals. Further, the ideal Triple Pulse depicts the Directional of a specific and significant data stream. Accordingly, the Directional is central to Information Dynamics. This measure is unique to data streams. Probability has neither a Directional nor Liminals in his bag of tricks. Because Probability's data sets are static, they have no direction.
Similarly, Probability includes many measures and forms of analysis, which are perfect for analyzing the features of static data sets, but are inaccessible to Living Algorithm mathematics. In general, Probability's measures reveal the general state of the Universe, while the Living Algorithm's Measures reveal the individual nature of its Flux. Consequently, their fields of action and the questions they inspire are entirely different. The Living Algorithm & Probability belong to orthogonal universes, which intersect in Human Behavior.

Field of Action determines Nature of Questions and Answers


The field of action (focus of attention) is incredibly important, as it determines the nature of the questions asked and therefore the answers obtained. Probability's field of action is comprised of static data sets whose members have identical features. The question that arises is: what is the nature of these data sets and what is the relationship between them? Probability answers these questions by making definitive statements about these attributes. In contrast, the Living Algorithm's field of action (focus) is growing data streams. The posed question: what is the nature of the most recent moment in the data stream in relationship to what went before? Accordingly, the Living Algorithm specializes in making suggestive statements about individual moments in the stream.
It is impossible to merge the perspectives. Inherent mathematical constraints prevent Probability from defining individual moments and prevent the Living Algorithm from defining general features. They are mutually exclusive perspectives. Let's examine one of the significant differences between these two approaches to data analysis due to their differing fields of action.
Because Probability concerns himself with static data sets, his predictive power is well defined. For the population under consideration, Probability determines the values of central measures (his averages) that are fixed, absolute, and never changing. This is because Probability's field of action consists of fixed, non-dynamic data sets (populations). Invariable, exact, permanent results are the order of the day. Probability's specialty is revealing the fixed features of any data set.
Because the Living Algorithm concerns herself with dynamic data streams, her predictive power is suggestive, not definitive in any way. That would be impractical, as her predictive power is relative to a dynamic context that is ever changing. Dynamic, constantly changing input leads to changing results. Impermanence is the order of the day. (Physicists everywhere are cringing.) The Living Algorithm's specialty is revealing the trends of the data stream under consideration: the trajectories of the most recent moment. She accomplishes this task by analyzing the way an information flow changes over time (the changing momentum of the data stream: the dynamics).
Probability's field of action is the fixed State of the Universe, as it pertains to the permanent nature of fixed data sets. This form of analysis applies particularly well to material systems, whose physical laws are constant and never changing. The Living Algorithm's field of action is the Universal Flux: the evolving state of data streams. This form of analysis applies particularly well to living systems, whose biological processes are constantly evolving.

Probability: Permanent Features; Living Algorithm: Changing Relationships


To illustrate these concepts, let's look at a few examples. Probability can take a computational snapshot of a data set and apply the insights
that are derived backward and forward in time to every data set that shares the same characteristics. This is why his conclusions are so powerful regarding matter. These insights apply to every piece of matter that has ever existed because matter sets are uniform. For instance, water molecules have always been and will always be the same.
A fundamental reason that Probability has such a difficult time describing human behavior has to do with evolution, both social and biological. Due to this evolutionary nature, the data sets regarding human behavior are frequently not uniform with respect to time. People are continually changing from birth to death and human culture is continually evolving. Consequently, the insights derived from one data set are harder to apply to other data sets of humans. For instance, care must be taken when comparing young people with old people, Africans with Europeans, modern women with Stone Age women, or college students with the rest of humanity. Many reputable studies of human behavior, which have been acceptable in all other regards, have been fatally flawed due to inappropriately applying the results from one data set to another data set of seemingly similar nature. In contrast, since it is universally assumed that electrons in all times and places have always been identical, it is appropriate to apply the results from one data set of electrons to another.
The Living Algorithm can take a computational snapshot of a moment in the dynamic changing scene of the data stream and also come up with some definite answers (the Living Algorithm's Predictive Cloud). But these definitive answers are immediately eroded by the incoming data: the constantly changing external landscape. Accordingly, a snapshot of a data stream only applies to that moment. The snapshot reveals the ongoing relationship between data points, not the permanent nature of the set.

A Data Set of 1s vs. a Data Stream of 1s


To illustrate these concepts, let's look at some specific examples to see what they reveal about the dual perspectives. Let's see what happens when we employ the two approaches to analyze a data stream or set consisting of 120 consecutive 1s (the Creative Pulse data stream). Probability finds this data set incredibly boring. It is so simple that the Average and Standard Deviation can be determined instantly without any calculations. The Average of this data set is 1 and the Standard Deviation is 0, as there is no change. In contrast, the Living Algorithm finds this data stream so interesting that there is an entire computer experiment accompanied by a written notebook devoted to it. Further, the mathematical results are related to experimental findings regarding behavior.
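
The contrast can be sketched in a few lines of Python, again using the assumed decaying-average update as a stand-in for the full Living Algorithm (the actual Creative Pulse experiments use the complete system of measures).

from statistics import mean, pstdev

ones = [1] * 120
print(mean(ones), pstdev(ones))     # 1 and 0: the whole set in two fixed numbers

D, average, trajectory = 12.0, 0.0, []
for x in ones:
    average += (x - average) / D    # each repetition updates the ongoing average
    trajectory.append(round(average, 3))
print(trajectory[:5], "...", trajectory[-1])   # an evolving curve, not a constant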
For contrast, let's employ the two approaches to analyze the data set containing the height of every man between 20 and 30 years old in the United States circa 2011. Probability can tell you the average height, the expected height range, and even how many heights it takes to create a smaller representative sample that reflects the entire population. In contrast, the Living Algorithm is helpless before this data set. Her field of action is ordered data streams. The current example is an unordered data set. What could she possibly say about an unordered data set, when her specialty is ordered data streams?
The central measures (averages) that Probability employs to characterize each data set are permanent, never changing. In contrast, the Living Algorithm never characterizes entire data streams, only individual moments in the stream. Further, the measures (the predictive cloud) that the Living Algorithm employs to characterize each moment in the data stream are unique to that moment. They cannot be applied to the data stream in its entirety. In relation to one another, they are in a constant state of flux, adjusting appropriately to each new data point. Probability's averages apply to the entire set, while the Living Algorithm's predictive cloud only applies to individual moments. Furthermore, these clouds, as the name implies, evolve with each new data point. For instance, although the Creative Pulse data stream only consists of ones, each moment has a unique role to play, with its own evolving predictive cloud. This provides the individual fingerprint for each moment in the stream. For Probability, each 1 is just another 1 in the group: no individuality whatsoever.
Just as data sets don't exist for the Living Algorithm, data stream moments don't exist for Probability. This has to do with the way each processes information. The Living Algorithm relates individual instants (data points) in the data stream to create moments. Probability does not recognize a relationship between individual points, except in relation to the general data set. Even when Probability analyzes the relationship between two data sets, his proclamations are absolute and apply to the data sets as a whole, not to individual moments. The sets are related or not by a specific percentage. Probability says nothing about individual moments, as his field of action is data sets, not data stream moments. The field of action determines the nature of the explanation. What can Probability possibly say about individual data points when his field of action is data sets?
In contrast, the Living Algorithm's field of action consists of data points in a stream. To indicate how important each data point is to the Living Algorithm, let's examine what happens when a 0 replaces any of the 1s in the Creative Pulse sequence. This slight change totally transforms the internal landscape of the remaining predictive cloud. Because these transformations were so dramatic, a series of computer experiments were run exploring these variations. (See Creative Pulse Review.) Even though the Living Algorithm's predictive cloud doesn't give any indication as to the state of the entire data stream, the predictive cloud says a lot about the relationship between individual moments. In contrast, substituting 0s for 1s in Probability's data set changes its permanent nature, but has absolutely no influence upon the future of individual moments. Because moments don't exist in Probability's system, there are no future moments to be affected. Probability does not care about the individual, except in relation to the whole. For Probability, the individual is subservient to the group.
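
A quick Python sketch (the same assumed decaying average as before) shows the kind of sensitivity described here: dropping a single 0 into the stream of 1s reshapes every subsequent moment of the evolving average, whereas the set's mean barely moves.

def trajectory(stream, D=12.0):
    average, out = 0.0, []
    for x in stream:
        average += (x - average) / D
        out.append(round(average, 3))
    return out

pulse = [1] * 120
perturbed = [1] * 60 + [0] + [1] * 59     # a single 0 dropped into the stream
print(trajectory(pulse)[58:66])
print(trajectory(perturbed)[58:66])        # the following moments all shift, then only slowly recover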

This difference between the two systems is due to the way they process data. For example, each approach determines its respective measure for range of variation in a similar, yet distinctly different, fashion. Both employ the same formula, but with one crucial difference. To compute the Standard Deviation, Probability relates the individual points to the general mean average of the set. On the other hand, to compute the Deviation, the Living Algorithm relates individual data points to the Living Average of the preceding moment.
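
In sketch form, the contrast looks roughly like this. The Standard Deviation measures every point against the one fixed mean of the set; the stand-in Deviation below folds each point's distance from the preceding moment's average into a decaying average (the document's actual Deviation may mirror the squared form of the standard deviation more closely).

from statistics import pstdev

def living_deviation(stream, D=5.0):
    average = deviation = 0.0
    for x in stream:
        deviation += (abs(x - average) - deviation) / D   # distance from the prior average
        average += (x - average) / D                      # then update the average itself
    return round(deviation, 3)

data = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]
print(round(pstdev(data), 3))    # every point related to the single fixed set mean
print(living_deviation(data))    # each point related to the evolving prior average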
This discussion has clarified a few issues. Although the Living Algorithm and Probability have similar measures (averages and deviations), each has a unique field of action. Probability's field of action is static data sets. He computes universal features of these sets with no attention to the individual points, except as to how they contribute to the whole. The Living Algorithm's field of action is individual moments in a
dynamic data stream. She computes specific features of each moment with no regard for the set as a whole. Accordingly, their fields of action are complementary, each with a unique perspective. Neither is a subset of the other. Probability specializes in determining the general features of fixed data sets, while the Living Algorithm specializes in determining the changing features of individual moments in a data stream. As well as her other monikers, the Living Algorithm System could also be deemed the Mathematics of the Moment.
It is evident that the Living Algorithm and Probability are complementary systems. However, the use of probabilistic measures is widespread in the scientific community, while the Living Algorithm is not employed at all. Does this indicate that the Living Algorithm System has no scientific validity? The Living Algorithm certainly provides interesting information about the moment. But does this information have any scientific utility? To explore the issues behind this question, read the next article in the stream: General Patterns vs. Individual Measures.


General Patterns vs. Individual Measures


29 November 2015
08:43 AM

Transient & Individual Data lacks Certitude that Science requires



The previous article illustrated how the Living Algorithm specializes in characterizing individual moments in a data stream, and how Probability specializes in characterizing universal features of fixed data sets. We've also seen how one of Probability's measures, the batting average, is employed in a predictive fashion. This average provides an estimate concerning future performance that is highly valued by the baseball community. Note that neither Probability's nor the Living Algorithm's estimates about a player's performance are guaranteed. The batter could fall into or break out of a slump at any moment. The measures provided by either system only indicate a probable performance, not one that is predetermined. Further, it is impossible to even put a probable range onto the accuracy of either prediction.
Why can't the scientific community apply their rigorous tools to the batting average? And if the batting average has no scientific value, what scientific significance does the Living Algorithm's Predictive Cloud have?
Probability can apply his averages and standard deviations to the growing data set of a player's at bats. However, Probability cannot apply his more sophisticated statistical tools that measure the parameters necessary to determine predictive accuracy: the precision of the estimate. Specifically, he cannot apply the standard error of the mean that is necessary to set confidence levels. This is a fatal flaw in any scientific study based in statistical analysis. The careful application of these tools is essential for publication in scientific journals. In short, a baseball player's batting average is too individual and transient to have any scientific value.
To crystallize these ideas, let's look at an example from our political world. The more thorough polls will survey 2000 random people from the rolls of American voters to make estimates about election outcome. Pollsters then make statements like: if the election were held right now, 45% of the population is likely to vote for Obama, with a range of 5% in either direction. This is the practical application of Probability's measures. An average (45%) that characterizes the set is highlighted along with the range (5%). The range indicates the confidence limits of the estimate of voter preference.
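
For reference, confidence limits on a poll of this kind are conventionally computed with the normal-approximation margin of error for a proportion, as the short Python sketch below shows; the quoted 5% is the article's illustrative figure, while the pure sampling margin for 2000 respondents works out tighter.

import math

def margin_of_error(p, n, z=1.96):
    """95% margin of error for an observed proportion p from n respondents."""
    return z * math.sqrt(p * (1 - p) / n)

print(round(margin_of_error(0.45, 2000), 3))   # about 0.022, i.e. roughly two points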
Although Probability provides precise measures concerning the general features of the data sets of these opinion polls, the results only apply to that moment in history. The qualifying statement, 'If the election were held right now,' indicates how tenuous the predictions are. Due to the volatility of political conditions underlying voter preferences, the opinions of the voter set are continually subject to change. Accordingly, the intentions of a set of voters at one moment in time can only be loosely compared with the intentions of the set of future voters. Because of the transient and individual nature of these polls, there have been many notorious examples of voter polls predicting the success of one candidate when the other wins. Due to this lack in predictive accuracy, opinion polls are like the weather report; we pay attention but don't place too much stock in their predictions due to the constant potential for an abrupt change in conditions. Because it is impossible to generalize the certainty about that data set to any other equivalent data sets, opinion polls have no scientific value.
Science requires a certain level of certitude combined with the ability to generalize analysis to similar circumstances. Each baseball player's performance is so individual and transitory that it is impossible to generalize the analysis from one player's set to another, or even to the player's future performance, with any certainty. This doesn't take away from the pragmatic predictive value of the batting average. It just means that it is impossible to achieve scientific certitude. In contrast, Probability can generalize results from identical machines. For instance, a company could track the performance of 100s of identical cars (the same make and model) and employ Probability's talents to make well-defined predictions about future performance. Similarly, scientists can examine the effects of the identical drug on a significant number of humans in a certain stage of a disease and make some sound scientific estimates about the future performance of the drug. However, each player is so unique, the pitchers he is facing are so different, and the psychological pressure of the big game so variable, that it is impossible to have identical humans and circumstances to compare statistics with.
Although Probability can accurately characterize a player's data set, he can't apply this analysis to the player's future with any certitude. This is due to the transitory nature of a ball player's life as he moves through time and circumstance. A ball player's performance lies in an unknown future. Aging, accidents, and illness are three common features of life that can have an unpredictable and abrupt effect upon the living data stream of a baseball player's at bats.
For instance, nobody would ever attempt to equate the hitting data set of 20-year-old home run hitter Barry Bonds with his hitting data set when he is 40 years old. This is due to life's inevitable aging process. This means that the data set of his performance must be continually redefined. Because of the necessary redefinition, the sets are different and hence aren't comparable scientifically. Further, no one would ever attempt to equate 20-year-old Barry Bonds' hitting data set with another 20 year old's hitting data set because of inevitable individual differences. In short, the hitting data set of 20-year-old Barry Bonds can only be equated with itself. Barry Bonds ages, and no one is quite like him. The data stream characterizes the change inherent to living systems. The individual and transient nature of a human's data stream renders the analytic tools of Probability's fixed set mathematics incapable of establishing the certitude that Science requires.

Living Algorithm's Patterns are scientifically significant, not the individual measures.
The same analysis applies to the Living Algorithm's predictive cloud. Data streams are so individual and transient that it is impossible to achieve the certitude that Science requires. However, the Living Algorithm's predictive clouds supply an abundance of practical information when applied to living data streams, as evidenced in our batting average example. It is a plausible assumption that living systems employ this pragmatic tool, the predictive clouds, for assessing environmental patterns to best determine the most appropriate response to ensure survival. If Life employs the predictive clouds, then Life is also subject to the Living Algorithm's information patterns. In Triple Pulse Studies, the first notebook in this series, we examined many examples of how Life has employed the Triple Pulse, one of the Living Algorithm's information patterns, to organize human behavior regarding sleep. Accordingly, the scientific value of the Living Algorithm
System lies in its ability to reveal the underlying information patterns that motivate behavior.
However, the Living Algorithm System doesn't have the tools to establish the scientific certitude of these connections. Probability's analytical talents are required to verify, or at least establish the limits on, the correspondences between human behavior and the Living Algorithm's information patterns. Once again it seems as if Probability and the Living Algorithm represent complementary systems.
As complementary systems, Probability provides a mathematical analysis of the general nature of fixed data sets, while the Li ving
Algorithm provides a mathematical analysis of the individual moments in dynamic data streams. Further due to the fixed and ge neral nature
of Probability's analysis of sets, the results can also be generalized with a distinct measure of scientific certitude. In co ntrast, due to dynamic
and individual nature of the Living Algorithm's analysis of moments, the measures that determine the trajectories of individu al moments
cannot be generalized. Hence the individual measures generated by the Living Algorithm, while possessing great pragmatic valu e, have no
scientific value.
While the individual measures of the Living Algorithm have no scientific value, the Living Algorithm's method of digesting in formation
reveals patterns that seem to influence living behavior (Triple Pulse Studies). What is the nature of these patterns? In what manner do they
differ from the patterns that Probability reveals?
To understand these differences, the next article is a historical investigation of Probability's rise to the top of the subatomic world. Ironically, the story of how Probability became famous as ruler of the subatomic world illustrates both his inherent strengths and weaknesses. Further, it pertains to why the Living Algorithm's dynamic nature is ideally suited to determining causality, while Probability's static nature is more suited to description. As with other aspects of these respective systems, their talents are mutually exclusive. Read the next article in the stream, Dynamic Causality vs. Static Description, to see how Probability was able to patch up the gaps in the subatomic universe that were left by classical Mechanics. Also see how Probability sets the stage for the Living Algorithm's entry onto the scientific stage.
Or perhaps you've tired of this endless exposition. For a fresh allegorical perspective, reenter our alternate universe and read Probability's Numbers vs. Living Algorithm Patterns.


Dynamic Causality vs. Static Description


29 November 2015
08:44 AM

Probability, initially, just a computational tool, not a theoretical necessity



At the end of the 19th century, the continuous equations of Mechanics (traditional Physics) reigned supreme. Many believed that Mechanics had uncovered all the universal laws of matter. It was recommended that young men pursue a different line of research because this field was exhausted. These continuous equations, which delineated the universal laws of Newton's mechanics, could accurately account for the behavior of matter down to the atomic level. The subatomic realm, which was to turn everything upside down, had yet to be revealed.
Due to their explanatory power, it was thought that these continuous equations accurately reflected the nature of Matter, the Universal Substance. Some of the more enthusiastic claimed, and some still claim, that this power also extends to Life, the form of matter that is alive. The implications of these continuous equations are basic. Space and Time are continuous and distinct. Cause and Effect is an unbroken, instantaneous, automatic affair. Once the appropriate starting point (initial conditions) has been established, the appropriate equation, which had already been derived, could accurately determine all future moments with great precision. In other words, the Universal Fabric had no tears or discontinuities. Everything moved as if it were some giant clock. It was a comforting, feel-good perspective on the nature of reality. All behavior conforms to precise laws. No paradox or ambiguity. Nothing unknown. Everything has a scientific explanation.
At this point in history (the late 1800s) Probability was just a supporting actor. Just recently admitted to the exclusive Science Circle, his credentials were a bit dubious. Probability was still associated with gambling and its unpredictable laws of chance. Employing Gaussian distributions (the normal curve), Probability could accurately predict long-term patterns. This was his scientific utility. However, he could not determine what would happen in the next throw of the dice. Because of his inability to make firm predictions, Mechanics looked down on Probability as a messy or wishy-washy mathematics.
As long as Mechanics was describing planetary positions or the trajectories of cannon balls, he was on solid ground. The situation changed as ordinary-sized objects turned into collections of atoms and molecules. The computations were too overwhelming for the simplified version of reality supplied by Mechanics. He had to rely on Probability to perform the computations regarding the atomic universe. This was a natural progression for Probability, whose specialty is characterizing the features of fixed data sets. He had dealt effectively with data sets of identical dice or coins. Probability could apply these same talents to identical atoms and molecules. Thermodynamics was the first field to employ Probability's talents extensively to precisely predict the behavior of hundreds of millions of atoms and molecules. Probability became the computational tool of choice when dealing with material systems containing quadrillions of equally weighted elements. He is able to make exceptionally precise, practically exact, predictions about the behavior of gases. For instance, Probability's measures enable scientists to make miraculously accurate statements about the behavior of oxygen molecules when they are heated.
Probabilistic uncertainty, of course, flew in the face of the deterministic worldview of Mechanics, where everything could be predicted. Yet the contradiction was easily resolved. If the positions and trajectories (the initial conditions) of all the atoms and molecules could be accurately mapped out, Mechanics could determine the next throw of the atomic dice. Probability's talents had just been employed as a form of approximation - a computational necessity due to the sheer number of atomic particles in the process. Probability was not yet a philosophical necessity. In theory it was certainly possible to calculate the behavior of the atomic particles with the laws and principles of Newtonian dynamics. Probability was only necessary to make the calculations possible, not for theoretical purposes. He was just a supporting actor - a mere computational tool with no other significance.
"Employing Probability is merely a practical convenience," Mechanics confidently asserted. "This reliance on Probability in no way impacts our view of an orderly, continuous universe where everything is predetermined. My system describes the natural order perfectly, so my equations still reign supreme. There is no place in my system for Probability's uncertainty." At this stage in history scientists could still envision a world that consisted of distinct particles and waves automatically and continuously interacting with each other. Then came the electron.

Probability's ascendance to Ruler of the Microscopic (Atomic) World


In the late 1800s the scientific community had good reason to believe their quest to uncover the elemental building blocks of the Universe was at an end. Most of the experimental evidence of the time indicated that atoms are the fundamental particles underlying material existence. The prevailing view was that these atoms were indivisible particles - microscopic grains of sand. When J.J. Thomson uncovered evidence for the electron, many considered it to be the 'atom of electricity'.
Rutherford, Thomson's student, presented evidence indicating that electrons, instead of being a type of atom, are tiny particles that orbit the nucleus of an atom. Under this view the 'indivisible atom', the fundamental particle, is more space than matter. These findings flew in the face of the current atomic paradigm and catalyzed a break with Thomson. Everyone was looking for a tiny brick with which the universe could be constructed. Instead of a microscopic particle, there was this entity that was filled with emptiness - hardly the candidate they were looking for. In contrast, Rutherford and his followers embraced the electron and proton as the latest indivisible particles. This viewpoint was congruent with traditional thinking. Space was still continuous. Waves and particles were still separate.
Niels Bohr, a student of both Thomson and Rutherford, introduced quantum theory into the mix. Much of the evidence concerning the electron came from analyzing the spectrum of light that is given off by a hydrogen atom. One of the first conclusions was that a moving electron gives off light. On closer examination it was found that the spectrum of light given off by the electron was bunchy rather than continuous. There were inexplicable gaps in the fabric. Bohr came up with a purely mathematical explanation for this phenomenon by employing Planck's energy quanta. The model suggested that electrons could only be in distinct orbits, but nowhere in-between. Although the model fit the precise data perfectly, the theory behind the model satisfied no one, including Bohr. It implied that space, rather than being continuous, was actually discontinuous, at least beneath the surface of the atom on the subatomic level. Then came World War I.
After the war, a primary focus of young Physicists was to resolve this heresy against Mechanics in a traditional manner. Everyone, including
Bohr, was convinced that electrons were microscopic particles moving through a continuous space and giving off light waves. There must be a novel perspective, as yet hidden, that would resolve the paradox of quantized space. The supposed solution came from a surprising direction.
To resolve the inconsistencies with this perspective, Schrödinger derived his famous equation, which defined electrons instead as waves. This solution was radical, but it still hadn't challenged traditional notions of a continuous space and a single truth.
While his equation fit the increasingly precise data concerning electrons, hypothetical extensions of Schrödinger's equation led to some strange and impossible results. For instance, when certain conditions were introduced into the equation, atoms expanded to the size of the Pentagon. To resolve this seemingly paradoxical situation, Max Born successfully applied probability theory to the problem. His solution indicated that the motion and position of the electron could be more accurately characterized as a probability wave. Improbable as the solution seemed, Born's probability insight resolved the mathematical difficulties introduced by Schrödinger's equation and fit the hard data. But this resolution implies that the world outside the atom is essentially different from the world inside the atom - the one continuous and certain, the other discontinuous and ambiguous - a Quantized and Probabilistic Universe.
Born's solution, while it fit the facts, didn't resolve the question of why some experimental results suggested that the electron was a wave while others suggested that it was a particle. Exploring the mathematical inferences of quantum theory, Heisenberg, Bohr's student, derived his famous Uncertainty Principle. An electron's static position or dynamic movement can be measured precisely, but not both. This suggested that an electron was either a wave or a particle depending upon the mode of observation. A simplistic version of this philosophical earthquake maintains that subjectivity is a factor in observation due to inescapable mathematical constraints.
The insights of Born and Heisenberg moved Probability to center stage in the quest to understand the essential nature of the Subatomic Universe. Richard Feynman's insights into Quantum Electrodynamics sealed Probability's position as Ruler of Subatomic Particles. Feynman's insight was even more counter-intuitive. His formulas allowed scientists to calculate the behavior of subatomic matter to unbelievable levels of precision. However, the basis for these equations was the assumption that subatomic particles move in all possible directions simultaneously, including forward and backward in time. His computations revealed which of these directions were not canceled out by contrary motion. Scientists now had to take all possible directions into account to make this probabilistic computation.

Probability's uncertainty at the heart of Subatomic, hence Physical, Universe


Probability's rise to preeminence was not without controversy. If the insights were true, it was now essential to think of the Universe in a probabilistic and paradoxical fashion. Due to Probability's innate nature, subatomic particle/waves weren't predictable on the individual level. Although Probability can make incredibly precise statements regarding large groups of electrons, he cannot predict the behavior of an individual electron or photon. He has a hard time relating to individuals (probably an Aquarius). A single photon, like a single coin, can go heads or tails with each flip of the experimental coin. The single photon's eventual resting place is just as random as flipping a coin. The results concerning a group can be predicted extremely well; those concerning an individual, not at all. Probability's precisely defined predictions break down when applied to individual bits of eternal matter.
Further, a direct implication of the Heisenberg Uncertainty Principle is that electrons and photons have no absolute identity. The observer determines whether the subatomic particle is a wave or a particle. The mode of perception determines the answer. There is no definitive answer as to the electron's ultimate nature. It is solely dependent upon the observer. The question determines the answer. There is no absolute truth. These solutions placed uncertainty at the heart of the subatomic, hence the physical, world.
To indicate how revolutionary the implications of these scientific findings were, a group of scientists, including Nobel Prize winners Schrödinger and Einstein, immediately came out in opposition, claiming that there was still an absolute truth out there just waiting to be discovered, one that would resolve this core uncertainty. Appalled at this turn of events, Einstein made his famous statement, "God doesn't play dice," and spent a considerable amount of his prodigious mental capabilities in the unsuccessful attempt to resolve the ambiguity that now lay at the very heart of Physics. In essence, this prestigious group was defending the absolute theoretical and philosophical eminence of Mechanics from this internal revolution by the adherents of Probability. Certainty was under attack, and they were employing all the analytic tools at their disposal to prop up the falling empire.

Embracing inherent Paradox & Ambiguity


Another group of Physicists, led by Niels Bohr, embraced Probability's inherent uncertainty and even attempted to enshrine paradox and ambiguity as inherent aspects of existence. Bohr had the credentials to lead the revolution, as he had initiated the whole mess when he successfully applied quantum theory to the subatomic world. In Copenhagen in 1928 he presented his principle of complementarity. He postulated that for subatomic theory to be complete, electrons and photons must be viewed as both waves and particles, even though these were mutually exclusive states. This was the paradox.
On the subatomic level, the perspective of the observer determined the nature of reality, rather than vice versa. This upended the traditional perspective that there was an objective reality out there just waiting to be discovered. Truth was relative to perspective rather than absolute. There were now two truths rather than one. The subjective implications were unacceptable to those who demanded one absolute objective truth. This polytheistic perspective was particularly disturbing to Einstein, as it implied that there was a discontinuous tear in his monotheistic space-time continuum. He had tied the package together with a neat bow, only to find that there was an unexplainable hole in the fabric. He spent the remainder of his career in an unsuccessful attempt to repair the paradoxical gap.
Part of the problem in resolving the paradox had to do with the Heisenberg Uncertainty Principle. The mathematical principle stated that knowledge of position and dynamics were inversely related in terms of the electron. In other words, the more one knew about the position of an electron, the less one could know about its dynamics. Further, since Heisenberg had derived his formula, the uncertainty constant had appeared in myriad diverse circumstances. This inner connectivity enhanced the credibility of the theory that there is an absolute limit to our knowledge. This viewpoint totally contradicted the prior scientific mindset that absolute knowledge was attainable. That attitude had seemed entirely believable after Mechanics uncovered the universal principles that govern the atom. As the atom was believed to be the ultimate particle - the building block of matter - this seemed to be the end of knowledge. With the advent of the electron, the facade of absolute certainty was replaced with absolute uncertainty.


Bohr took the implication of the Uncertainty Principle to a new level of ambiguity. He reasoned that if knowledge of position and dynamics are inversely related, then description and causation are also inversely related. The more precise our description of position, the less we can know about the dynamics. Yet dynamics indicates the relationship between things. As such, dynamics is intimately tied to causation and meaning. The more precisely position is described, the less we know about causation - content versus process. Von Neumann, arguably the top mathematician of the 20th century, proved this mathematically.
This viewpoint totally contradicted the traditional certainty of Mechanics. His continuous equations, combined with the initial conditions, could precisely define the exact position and dynamics of any object, including atoms, at any point in time. By challenging the possibility of absolute knowledge with his principle of complementarity, Bohr also challenged the supremacy of the continuous equation as a philosophical metaphor. Rather than reflecting the innate nature of the universal order, these equations were demoted to fancy models. While an accurate reflection of the atomic world, the continuity was filled with holes on the subatomic level. Look closely enough and paradox and ambiguity emerge. It is no wonder that there was such an uproar - the very foundations of our Western ethnocentric arrogance were being challenged.

Mechanics & Probability wedded as Modern Physics


The uproar, however, died out relatively quickly. For one, Bohr's principle of complementarity only confirmed the traditional relationship between Mechanics and Probability. Mechanics, as a system of dynamics, had supplied the explanatory power in terms of causation. Probability, as a static system, had only supplied descriptions. Although Probability was required to plug the gaps, his tools are primarily employed as a computational device, not as an explanation of causation. Probability's main philosophical contribution is the innate ambiguity and paradox at the heart of matter. Although amplified by the findings regarding the subatomic universe, ambiguity is an innate feature of the games of chance that gave rise to the laws of Probability. Further, Gödel proved that for a system to be complete it must contain paradox. Thus, by introducing paradox, Probability completed the system of Physics as an explanatory tool.
The following diagram illustrates the development of the relationship between Mechanics and Probability.

Mechanics provides both the explanations of causation and the computational power when objects are large enough to be seen with the eye, such as planets and balls. Mechanics still provides the causal mechanisms when the objects are invisible to the naked eye. This includes both atomic and subatomic particles. Probability is required to provide the computational power for the uncountable numbers of microscopic particles. In the subatomic realm of electrons and photons, Mechanics requires Probability to fill in the gaps in his theory. As such, Probability completes the explanatory picture.
Despite the inherent philosophical uncertainty that this solution introduced, Physicists could now accurately predict the behavior of pure matter from the level of the electron and proton all the way up to the galaxies and everything in-between. Probability was used to perform the computations for the data sets of eternally identical atomic particles. When the matter became big enough, Mechanics (classical Physics) with his continuous equations took over. Physicists proudly claimed that they could predict the behavior of matter on a continuum from the microscopic to the macroscopic. Probability and Mechanics were now wedded forever as modern Physics. The abilities of both were required to describe the behavior of pure matter. However, when pure matter was polluted with life, this merger proved helpless. For instance, Physics has a hard time predicting where a chicken will land when tossed into the air.
Intoxicated by the explanatory and predictive powers of this merger of classical Physics and Probability with regard to pure matter, its followers began claiming that they were on the verge of predicting the behavior of everything. A classic logical chain of scientific reductionism goes as follows: "We can predict the behavior of the subatomic world, which provides the building blocks of the atomic world, which provides the building blocks of the material world, which provides the building blocks of the Universe. Therefore we can predict the behavior of the Universe. We just have a few details to work out."
This line of reasoning is confuted by the notion that the field of action determines precedence, not the building-block/fundamental-principle mentality. For instance, while Physics informs us about all the incredible details of resonance, it tells us virtually nothing about the music of Bach's Brandenburg Concertos. In a similar fashion, the laws of Material Science provide Biology with some inescapable constraints. While these forms supply the essential structure that enables the development of the complexity required of living systems, they do not determine meaning, the Music of Life, any more than the laws of resonance reveal anything about the meaning of music. The field of action determines the nature of the explanation. While underlying structure enables essential complexity, it does not determine meaning.
All the subtle concepts introduced by subatomic particles, the ambiguity and paradox, were swept under the rug to keep them out of sight. The scientific community didn't want this uncertainty to taint the triumphant union of Mechanics & Probability. This union is an explanatory and computational tool that could describe the behavior of matter almost completely - the operative word being 'almost'. The arrogance of certainty emerged almost immediately after the hubbub died down. After all, Physics could confidently claim that he could totally explain and compute everything that really matters - especially if all that matters is matter. Living matter is another story.

Probability paves the way for the Living Algorithm.


How does this discussion apply to the Living Algorithm's mathematical system of digesting information? Like Mechanics, the Living Algorithm provides a system of dynamics. As a system of dynamics, the Living Algorithm provides explanatory power that Probability, as a static system, can never hope to provide. While probabilistic descriptions provide boundaries, they can't possibly reveal underlying meaning.
This was one of the insights of Bohr's complementarity principle. It was possible to know either process or content, not both. This is another reason that the two are complementary systems. The Living Algorithm reveals the patterns of the data stream's process, while Probability reveals the content of the data set.
Before Probability was required to plug the subatomic holes, continuous equations reigned supreme both scientifically and philosophically. With Probability's ascent, the importance of continuous equations as a philosophy waned. They became a great model rather than a definitive feature of the Universal Substance. This dethronement of continuity opened the door for the Living Algorithm's system of dynamics, which is digital.
Although the Living Algorithm system has very little in common with the material world from the atom on up, the Living Algorithm's patterns have much in common with subatomic wavicles. Schrödinger's equation transforms the electron from a particle to a wave. Born's mathematical resolution transforms the material wave into a probability wave. Bohr's interpretation implies that probability is information more than material. Hence the electron and photon become information waves - packets or pulses of information. The Living Algorithm's basic manifestation is as a pulse of information - the Creative Pulse, a.k.a. the Pulse of Attention. In fact, the eyes sense the individual photon as one of the Living Algorithm's fundamental information pulses.
This analysis suggests that the triad of Mechanics, Probability and the Living Algorithm forms a comprehensive interlocking system that incorporates both the static and dynamic nature of matter and life. Each mathematical system is necessary to explain and analyze different parts of the puzzle. There would be a conspicuous gap if any of the systems were excluded. The similarities and specialties are shown in the following diagram.

The bowed triangle in the center is where the three sets intersect as equation-based systems. The systems of Mechanics and Probability intersect as studies of the material world. Mechanics and the Living Algorithm intersect as systems of dynamics. Probability and the Living Algorithm intersect as types of measures. Each of the mathematical systems is unique in its own way. The Living Algorithm is digital; the equations of Mechanics are continuous; and Probability's are static.

The Living Algorithm specializes in the dynamic relationships between moments.


The Living Algorithm's field of action is the ongoing dynamic relationship between the individual data points in a stream. This focus on individual interactions gives rise to some intriguing features that are inaccessible to Probability's universal and static focus. Let's enumerate a few.
The Living Algorithm focuses upon dynamic interactions. Accordingly, mechanistic concepts, such as force, energy, work, and power, can be applied to data streams. Because Probability's data sets are static, they have none of these dynamic features. Because of her dynamic nature, the Living Algorithm provides meaning and causality, which Probability, due to his innate fixed nature, can't. The next volume in the series, Data Stream Dynamics, provides an in-depth analysis of the Living Algorithm's dynamic nature.
We've made references to the Living Algorithm System's dynamic nature. This dynamic feature is due to the Living Algorithm's ability to characterize individual moments in a data stream by relating data points to each other. How does she accomplish this unusual, yet special, feat? To find out the answer, read the next article in the stream, Mathematics of Relationship.


Mathematics of Relationship
29 November 2015
08:44 AM

The Mathematics of Relationships



In prior articles we provided evidence to support the following claims: 1) Probability best characterizes the permanent and general features of fixed data sets. 2) The Living Algorithm best characterizes the changeable and individual moments of living data streams. As such, we have chosen to call the Living Algorithm's math the Mathematics of the Moment. Because of the way the Living Algorithm digests data streams, she also puts these moments in relationship to each other. Accordingly, what happens at each instant in time has an effect upon subsequent developments. Let's see how.
A. An Instantaneous Data Bit becomes a Moment

The Data Stream Mathematics of the Living Algorithm System specializes in relationships. This focus is due to the way in which the Living Algorithm digests information. She takes raw data (which we will call instants) and spreads them over time (which we will call moments). Graph A illustrates what happens when the Living Algorithm digests the number one (raw data/instant) and transforms it into a moment. The instant is the point when the data enters the System. The impact is greatest when the data (instant) first enters the System. The effect of the impact on the System decays with each repetition of the process (iteration). This is why the measure of this diminishing impact is named the Living Average. In this case, it took about 60 repetitions (iterations) until the original impact faded to 'practical' zero. It could be said that the Living Algorithm transforms a one-dimensional entity into a two-dimensional entity. More simply put, the Living Algorithm gives Data a meaningful dimension by spreading its influence over time. As we shall see, raw Data is not in an accessible form for the organism until it undergoes this transformation. Because moments are accessible, they provide the organism with meaning.
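For readers who like to tinker, here is a minimal Python sketch of this decay, not the author's actual implementation. The update rule is the three-step procedure described later in this stream (find the difference, scale it down, add it back); the 1/16 scaling ratio is only an illustrative assumption, chosen because it yields a peak impact of roughly .06, as in Graph A.

# A minimal sketch of the Living Average, not the author's actual implementation.
# Assumption: a scaling ratio of 1/16, chosen so a lone 'one' peaks near .06.
def living_average(stream, scale=1.0 / 16):
    """Digest a data stream one instant at a time, yielding each new moment."""
    average = 0.0
    for instant in stream:
        average += scale * (instant - average)  # difference, scale down, add
        yield average

# Graph A's stream: a single 'one' followed by a string of 'zeros'
moments = list(living_average([1] + [0] * 120))
print(round(moments[0], 3))    # ~0.062: the peak impact of the lone instant
print(round(moments[60], 4))   # ~0.001: after about 60 iterations the impact is practically zero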
B. The Accumulation of Moments

With each repetition of the process, a new data byte (instant) is digested in a similar fashion. The Data enters the System as a one-dimensional entity - in this case a 'one'. The Living Algorithm then spreads the instant's impact proportionately over time. This process transforms instants into moments. With each subsequent iteration, each moment is layered on top of what went before, i.e. the diminishing (decaying), yet influential, effects of the preceding moments (Graph B). Notice that the blue area in Graph A is an enlargement of the small blue sliver at the bottom left in Graph B. All of the colors in Graph B represent the layering and accumulation of moments.
Graph A was produced by a data stream consisting of a single 'one', followed by a string of 'zeros'. A data stream consisting of 120 ones followed by 120 zeros produced Graph B. When there is just one data bit acting alone, its impact (the moment) on the system is quite small - its maximum is only .06. However, when there is a series of moments, as in Graph B - specifically 120 of them - the total collective impact on the system is much greater, eventually rising to '1.0'. The collective force of the stream of ones has an impact on the overall System that is eventually 16 times greater than a single one acting alone. In short, there is a greater impact upon the System when the Moments operate together, moving in the same direction.
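Continuing the same sketch, with the same assumed 1/16 scaling ratio, the stream behind Graph B shows this accumulation numerically:

# Graph B's stream: 120 'ones' followed by 120 'zeros' (same assumed 1/16 scaling ratio)
scale = 1.0 / 16
average = 0.0
trace = []
for instant in [1] * 120 + [0] * 120:
    average += scale * (instant - average)   # the same difference-scale-add rule
    trace.append(average)

print(round(trace[119], 2))   # ~1.0: the collective impact of 120 ones acting together
print(round(trace[239], 2))   # ~0.0: the impact decays away over the 120 zeros
# 1.0 is roughly 16 times the .06 peak that a single 'one' produces acting alone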
This type of interaction is additive (accumulative), which is certainly a valid form of interaction. Yet there is another aspect of the Living Algorithm's method of digesting information that integrates the Data in an even more complex fashion. The above analysis only applies to the Living Average - the simplest measure generated by the Living Algorithm. When more complex measures are introduced, the resulting computational stew is an intriguing combination of interactions between data and measures. Without getting into mathematical details, Chart C below indicates the complex weave between the Data and the Living Algorithm's Measures that is required to produce the value of each new Measure. The columns of Xs without hats (the 2nd, 4th and 6th columns)
represent the contributions of the Data, while the columns of Xs with hats represent the contributions of the Living Algorithm Measures.
C. Interlocking Interactions between Data & Living Algorithm Measures
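Since Chart C itself is not reproduced here, the sketch below only illustrates the general idea of the weave; the particular derivative Measures (a spread-like measure and a tendency-like measure) and their formulas are assumptions for illustration, not the Living Algorithm's actual definitions. The point is that each update feeds on the difference between the incoming Data point and the prior Measures, so Data and Measures are interlocked at every step.

# Illustrative sketch only: the derivative Measures and formulas below are assumptions.
# The point is the interlocking weave: every new Measure is built from the difference
# between the incoming Data point and the Measures of the previous moment.
def digest(stream, scale=1.0 / 16):
    average = spread = tendency = 0.0
    cloud = []
    for x in stream:
        diff = x - average                         # Data meets the prior Measure
        average += scale * diff                    # Living Average
        spread += scale * (abs(diff) - spread)     # a hypothetical range-like Measure
        tendency += scale * (diff - tendency)      # a hypothetical directional Measure
        cloud.append((average, spread, tendency))
    return cloud

last = digest([1] * 10 + [0] * 10)[-1]
print(tuple(round(m, 3) for m in last))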

We've seen two ways in which the Living Algorithm's method of digesting data streams creates a relationship between the moments. There is one final way, perhaps the most significant for this discussion, in which the Living Algorithm produces an Interactive System. All the triangles in Diagram C above indicate change. As such, all the interactions in the Living Algorithm System are based in change - the differences between a complex mixture of Data and Measures. Accordingly, no Moment exists as an independent entity in the dynamic System. Each Moment only exists in relation to the surrounding Measures and Data Points. Independent Existence is an Illusion. The only Reality is constant Change, at least in the Living Algorithm System.
In Probability's System, the Reality is fixed and never-changing. This is why definitive predictions are possible. Permanence can be characterized by definitive patterns. The event horizon can be narrowed to a single point - just one alternative, based upon initial conditions and an equation. The mathematical functions (equations) of hard science epitomize this claim. In contrast, because the Living Algorithm System is founded in dynamic change, only suggestive predictions are possible. Evolving transience is best characterized by suggestive patterns. The event horizon can only be narrowed, not eliminated. Identifying these suggestive patterns is the Living Algorithm's specialty.
In summary, the Living Algorithm's method of digesting Data Streams produces an Interactive System. This occurs in three significant ways. 1) The impact of individual data points accumulates to have a greater general impact upon the System. 2) The Measures and Data interact in a complex, interlocking fashion to produce derivative Measures. 3) The entire Living Algorithm System is based upon analyzing differences between Measures and Data. As such, each moment exists not as an individual entity, but in a dynamic relationship to the preceding moments. Accordingly, it seems fair to say that the Data Stream Mathematics of the Living Algorithm System could also be called the Mathematics of Relationships.
To see what other features the Living Algorithm has in common with living systems, check out the next article in the stream, Precision vs. Fungible Meaning.


Precision vs. Fungible Meaning


29 November 2015
08:45 AM

Fungible: To sacrifice Precision for Meaning



As we've seen, the Living Algorithm System and Living Systems share many features in common. This article explores yet another similarity between the two systems. Both have a fungible component. Let us explore the meaning of the word fungible as it relates to these two systems.
Fungible is a legal term with the following definition: "(of goods contracted for without an individual item being specified) able to replace or be replaced by another identical item; materially interchangeable: Money is fungible - money that is raised for one purpose can easily be used for another." A can of beans is also fungible in this legal sense because the exact number of beans has not been specified. This characteristic of any can of beans must be ignored in a court of law. This implies that there is an acceptable range of imprecision when considering a can of beans. The specific number of beans can have a wide range of values, as long as the can has the same general net weight as claimed. In this sense the word fungible allows for an acceptable range of imprecision in the application of law.
Drs. Jack Cohen and Ian Stewart in their book, The Collapse of Chaos, stretch the meaning of fungible to describe a unique aspect of living
systems having to do with the flexibility of interpretation. They point out that every culture, primitive or advanced, employs certain general
words to categorize birds of the same species. This is true even though the birds have an abundance of individual characteristics that separates
one from another. The word chicken can be applied to an entire group of birds because individual characteristics have not been specified. A
chicken can be large or small, young or old, black, speckled or patterned, and still be referred to as a chicken. To identify a meaningful
pattern, for instance the group chickens, it is necessary to overlook the individual characteristics of each bird.
Cohen and Stewart argue that living systems enlist this sense of fungibility to recognize meaningful patterns in their environment. The term
fungible suggests that a certain level of ambiguity is acceptable on the individual level if we are to make meaningful statements about the
whole. This is the sense in which we will use the word fungible.

Living Systems require Fungible Interpretative Mechanism


In this same sense, we suggest that fungible is a term that aptly applies to all living systems. Why?
Living systems require a certain tolerance for imprecision (ambiguity) in both interpreting and responding to the constant flow of
information. In order to recognize patterns, organisms must frequently overlook precise details in order to make rough (read practical)
comparisons. 'One can't see the forest for the trees' is a common metaphorical expression that recognizes how a singular focus upon only the precise details can obscure patterns. It seems that living systems recognize patterns by sublimating the potentially overwhelming abundance of
sensory details in order to make rough generalizations. This processing of information gives precise details a fungible character, which
approximates meaning.
Consciousness requires the raw data of sensory input. However, this data is only useful if it has meaning. Some mechanism must process the
raw data for it to make sense to the organism. The ongoing processing of information is a way of attributing meaning to sensory experience,
as we move through time. And because living systems move through time, these approximate meanings must adjust accordingly. This
essential meaning making process is fundamentally pattern recognition. These patterns provide the basis for the tentative working hypotheses
that organisms require for all significant interactions with their environment.
The interaction between sensory input and consciousness requires an interpretative mechanism. This interpretative mechanism must have a
fungible nature. In monitoring and adjusting to circumstances that are constantly changing, consciousness has a practical accuracy to it. In
fact, it is unrealistic to aspire to a higher level of accuracy. Changing conditions require living systems to be capable of utilizing a wide range
of behaviors in order to survive. For instance, Life requires a flexible response to sensory input in order to make an appropriate response to
the constantly changing circumstances that are an inherent feature of existence.
The mode of the explanation that describes this behavior must incorporate a tolerance of interpretation that sublimates detail for meaning.
This tolerance enables a unique response based upon context sensitive conditions. Due to the ever-changing nature of environmental
circumstances, a range of possible answers to a range of ongoing input is required. Any information-processing system that is employed by living systems must have a fungible interpretative mechanism.

Living Algorithm provides Fungible Interpretative Mechanism


Guess what? The Living Algorithm System provides this crucial service. Perhaps this is not so surprising any more - considering how many
congruencies there are between the Living Algorithm and Life. Just as with relationships, this is due to the very way in which the Living
Algorithm digests information.
As soon as the precise raw data enters the Living Algorithm System, it is dumped into the evolving stew of her ongoing averages, as if it were
a spice. These rough averages are combined to form derivative or composite averages (the Predictive Cloud). Imprecision multiplied by
imprecision. This imprecision is an asset, not a problem. Zadeh's Principle of Incompatibility, previously explored, states that precision and
meaning are inversely related when dealing with the complexity of living systems. Precision up; meaning down, and vice-versa.
The Living Algorithm's composite averages, her Predictive Cloud, consist of the sublimated detail of the raw data. They provide meaningful
information that describes the nature of the data stream's current moment relative to what went before. This up-to-date information is crucial
for survival. The fragile organism might not survive if there is a significant delay in response to environmental stimuli. Sublimating detail for
generality facilitates essential pattern recognition that is at the heart of meaning. To provide a fungible interpretative mechanism the Living
Algorithm System sacrifices precision for relevance.

Memory Requirements: Living Algorithm Minimal; Probability Prodigious


As soon as raw data enters the Living Algorithm System it is immediately dumped into a computational stew and is forgotten. After making
its instantaneous impact upon the moment, the data is absorbed into the System's ongoing measures - the Predictive Cloud. The data leaves traces of its impact, but its precise features are just a fading memory.
Probability, in contrast, must retain the precise features of each member of his fixed data sets. Remembering each of these values is essential if he is to adequately perform his primary function. Probability requires the perfect memory of a computer, or at least a ledger sheet, to remember the precision of his data points.
Probability's task, as we've discussed, is providing measures that characterize the general features of his data sets. Computing the values of these measures is tedious, some might even say complicated, to say the least. The Standard Deviation's square roots are never that fun. Psychologists everywhere breathed a sigh of relief when computers entered the scene to compute their statistical measures. Further, decades of schooling are required to understand and employ Probability's many equations. Living Systems don't have this luxury. The urgency of the moment demands an immediate response to preserve Life's fragility.
Biological systems have a difficult time remembering anything that is not relevant. A pile of precise numbers from the past has no meaning, except for what they contribute to the present moment. Because these precise numbers have no relevance, living systems would have a difficult time remembering them. In contrast, the Living Algorithm's Predictive Cloud provides crucial up-to-date information about data streams that could be relevant to survival. Accordingly, living systems could more easily remember the composite averages (the Cloud) that the Living Algorithm provides. Infused with the emotion of survival, the Living Algorithm Measures are well worth remembering, especially compared with the precise values of non-emotionally-charged data points. The Living Algorithm Measures are also easy to compute, requiring just one algorithm and basic math (no square roots).
The memory and computational requirements of the Living Algorithm System are all within the range of any biological system, including cells. The memory and computational requirements of Probability are well outside the range of any biological system, including a genius.
It seems safe to say that Probability's obsession with the precision of his data prevents him from providing Life with the up-to-date meaningful information that is essential for survival. In contrast, the Living Algorithm's neglect of extraneous detail allows her to provide the fungible interpretative mechanism that Life requires. In so doing, the Living Algorithm incorporates ambiguity into her System.

Could the Living Algorithm be Life's Operating System?


In the series of articles that constitute this study, we have developed the following ideas. Living Systems participate in a consistent interactive
feedback loop with their environment. Digesting or extracting information from the constant flow of environmental data streams is an
inherent feature of this biological process. The language of mathematics is ideally suited for this function. Accordingly, biological systems
require some type of data stream mathematics to digest this information flow. We deem this system the Mathematics of Living Systems.
To qualify as the Mathematics of Living Systems, this data stream mathematics must incorporate some key features. We have symbolized
these essential requirements in the words Immediacy, Relationship, Ambiguity/Fungibility, and Choice. The Living Algorithm's
mathematical system has fulfilled three of these requirements. Could it be that the Living Algorithm System is the operating system that
living systems employ to digest and thereby extract meaning from data streams? Based upon the abundance of supporting evidence, the
hypothesis that the Living Algorithm System is the Mathematics of Living Systems certainly seems plausible.
Let us suppose that we employ the Living Algorithm to digest numerical data. What about the multitude of information flows that can't be
assigned a distinct number? For instance, what about the relative terms that are so useful for organizing our world, such as lighter, bigger,
smaller, or smarter? How well does the Living Algorithm fare with non-numerical entities?
The next article in the series addresses this question. To see how every human utilizes the Living Algorithm formula to predict the future
multiple times daily, read The Living Algorithm Algorithm.


Living Algorithm Algorithm


Encompasses Numbers & Words
29 November 2015
08:46 AM

Introduction to Algorithms

"In mathematics and computer science, an algorithm is an effective method expressed as a finite list for calculating a function. In simple
words, an algorithm is a step-by-step procedure for calculations. Giving a formal definition of algorithm, corresponding to the intuitive
notion, remains a challenging problem." (Wikipedia) The 'step-by-step procedures' we learn in elementary school to add, subtract, multiply,
and divide large numbers are common examples of some simple algorithms.
The 'intuitive notion' of algorithm that we employ in our discussion of the Living Algorithm has to do with a 'step-by-step procedure for
determining an answer. In mathematics and computer science this answer must be precise and unique - right or wrong. This is also true for the
algorithm that determines the mathematical value of the Living Algorithm. However, when living systems employ the Living Algorithm's
algorithm to make reasonable predictions, the answer must only be close enough for practical purposes.

Humans employ Living Algorithm's Algorithm to make daily predictions


This concept is not complex in any way. Everyone employs the Living Algorithm's algorithm to make predictions every day of our lives - frequently. For instance, we regularly make predictions as to arrival time. This prediction determines when to expect a child, spouse, or friend to return home from school, work, or play; or conversely, when this assortment of individuals might arrive at their variety of destinations. As might be expected, this prediction includes a most probable time combined with a range of possibilities and, we will even suggest, a sense of the tendency (direction) of the pattern.
Let's turn these abstractions into a concrete example. We regularly make reasonable predictions as to when our partner will return home from work. Perhaps the vagaries of demands or traffic provide a range of possible times that our loved one might arrive. We might phrase this sense in the following manner. "She normally arrives home at 5:30, plus or minus a half hour. But recently she has been arriving a little later. So I'm expecting her home by at least 6:15. If she's much later than that, the 'worry switch' goes on. For she is never that late." This type of reasoning is common for people of all races, ages, and sexes. It might even be called common sense.
However, to make this normal statement with any degree of accuracy requires a reasonable knowledge of at least 3 values: the expected time (5:30), the range of times (plus or minus a half hour), and the direction (a little later). Further, this statement requires a rudimentary sense of the Bell Curve (never that late) and of thresholds (the 'worry switch' goes on). The numbers are complicated averages and the concepts are fairly sophisticated. Yet each of us, including some animals (feeding time), makes these computations on a daily basis regardless of socially determined mathematical or scientific skills. Although some make better use of this information than others, this is not determined by scientific or computational skills, but by common sense.
Crossing a threshold is frequently, if not always, the temporal event that triggers our decision to take action. Crossing a threshold also determines many, if not all, of our automatic biological responses. The aforementioned quantities (most expected, range, and direction) are required to determine these thresholds. So how do we compute these values that are so important in our decision-making? Is it time to call the police, get mad, or just wait patiently? When does our stomach turn on the digestive fluids, and when are they turned off?

Mean Average has Fatal Flaws


Of course the most important prediction has to do with the most expected time. How do we compute this value? A seemingly simple approach would be to compute the ordinary mean average of everyday arithmetic. This would give us the average arrival time of our partner. What would we need to compute this value? Just a simple list of arrival times. With this list we could use an elementary school algorithm (a simple 3-step procedure) to calculate the desired result: 1) Add up the arrival times to determine a Sum, 2) Count the number of arrival times, then 3) divide the Sum by the Count. Three simple steps. Et voilà! The average and most expected time of arrival.
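Sketched in Python with a hypothetical list of arrival times (expressed in hours after noon), the schoolbook procedure looks like this; notice that the entire list must be retained:

# The schoolbook mean-average algorithm: the whole list must be kept in memory.
arrival_times = [5.5, 6.0, 5.75, 6.25]   # hypothetical arrival times, in hours after noon

total = sum(arrival_times)      # 1) add up the arrival times to determine a Sum
count = len(arrival_times)      # 2) count the number of arrival times
expected = total / count        # 3) divide the Sum by the Count

print(expected)                 # 5.875, i.e. about 5:52 - the most expected arrival time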
Determining this value is simple, especially if you are a computer or an adding machine. The problem lies in remembering all the specific times, combined with counting the number of these values. Even remembering 4 arrival times is challenging, not to mention the addition and division. Further, although these individual arrival times are important in terms of calculating the average, they have no emotional importance in and of themselves. In contrast, the average (in this case, the expected arrival time) has emotional importance because, as a predictive tool, it sets expectations. Because of the emotional component it is easier to remember. In fact, neuroscientists call this 'emotional tagging' and claim it is a significant factor in memory.
The other problem with this form of computation has to do with the recency effect. In general, for living systems the most recent data input has
more weight (potential impact) than past data. Further the weight of the data fades with time. The most recent data has the most potential
impact, while the most distant data has the least potential impact. In terms of our example, we remember that our partner has arrived later than
normal recently. And we only vaguely remember that she may have come home late in the distant past. This has more to do with the
heightened importance of recent events in determining action, and less to do with the fading nature of memory.

Needed: Simple Algorithm with Complex Requirements


As living systems, we need a simple step-by-step procedure (algorithm) to determine an expected arrival time. However, the simple algorithm has some distinct requirements - minimal memory storage, minimal computation, and it must also take into account the recency effect. Is it possible to have a simple system that fulfills these complex requirements? Or should we just throw up our hands in despair? Does God provide these essential predictive values? Or will Science eventually discover some complex neurological mechanism that provides them?
Take a deep breath. Let's not give up quite yet. Let's go back to the beginning of this process. What was the only ingredient needed for the
computation of the average arrival time? A simple list. Sounds like a data set. Let's turn our data set into a data stream. In daily life, we process information as a stream, where the most recent points have the greatest significance (the recency effect). It is difficult to remember a simple list of arrival times. Yet remembering a stream of arrival times in the order of their importance is even more daunting. Terrible suggestion. Give it up. Let's run for the hills before confusion overwhelms us. Data Streams incorporating the recency effect are an order of magnitude more complex than a simple set of data.

Employing the Living Algorithm with Words, not Numbers


But wait a minute. Data Streams? Does that ring a bell? Of course! The Living Algorithm's specialty is digesting data streams. Let's see how her Family of Equations deals with the data stream of our partner's arrival times. Hmmm? What is the first data point?
Yikes! Another roadblock. None of us remembers exact times, just general notions of time - a little before six, a little after six, right on time, a lot after six, and way before six. How do we put 'a little before' or 'a little after' into our internal computer? Even the Living Algorithm needs numbers.
"Boss, let's get out of here. This is never going to work."
"Sloooow down, little brother. How about the Living Algorithm's algorithm?"
"The Living Algorithm's algorithm?! I thought that only dealt with numbers?"
"An algorithm is just a step-by-step procedure. Maybe it can apply to words as well."
"How can an equation possibly digest words?"
"If we don't look at it, how can we possibly say what it can do? Before running away, let's check out the Living Algorithm algorithm."

1) Find the difference between the most recent data point and the current average.
2) Scale this difference down by some ratio (1/2 or less).
3) Add or subtract this scaled difference to the current average to obtain the new current average.
Three steps. Let's see how it deals with our verbal data.
Suppose our first data point is 'a little more'. Let's see. 1) What is the difference between 'a little more' and the expected arrival time (the previous average)? Simple: a little more, nothing else. What's next? 2) Scale down this difference. Now our value is tiny. 3) Finally, add this tiny value (the scaled-down difference) to the previous average to get the current average. Because our partner arrived home a little later than normal, we now expect her to be just a tiny bit later from now on. Simple. If the arrival times get later and later, the expected arrival time (the average) drifts slowly later. If the arrival times are erratic, the expected arrival time hovers around a center point.
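For the numerically inclined, here is the same three-step procedure as a minimal Python sketch; the 1/8 scaling ratio is simply one illustrative choice within the 'one half or less' guideline, and the arrival times are hypothetical.

# The three-step Living Algorithm algorithm applied to numbers (illustrative 1/8 scaling ratio)
def update_expectation(current_average, new_data_point, scale=1.0 / 8):
    difference = new_data_point - current_average   # 1) find the difference
    scaled = scale * difference                     # 2) scale the difference down
    return current_average + scaled                 # 3) add it to the current average

expected = 5.5                         # expected arrival time: 5:30
for arrival in [5.75, 6.0, 6.25]:      # she has been arriving a little later recently
    expected = update_expectation(expected, arrival)

print(round(expected, 2))              # ~5.67: the expectation drifts slowly later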

Living Algorithm Memory Requirements: a single emotionally charged Average


Sounds pretty reasonable; seems sensible; but maybe too simplistic. For instance, how does this process deal with the accumulating stream of
arrival time data?
With each new arrival time data point, a new expectation is formulated. Once this occurs, previous expectations and the data that generated
them become as unimportant as yesterday's weather. Accordingly, the memory requirements of this algorithm are simple: just one value,
representing the current expectation. This expectation is constantly adjusting to the most recent input. There is no need to remember any of the
specific arrival times or how many of them there are. As such, the Living Algorithm's algorithm is incredibly efficient, requiring only the
memory of the expected arrival time. Conversely, computing the simple mean average requires us to remember the number and values of a
sequence of arrival times.
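The difference in memory cost can be sketched directly. In this illustrative comparison (the names and sample data are ours), the running expectation carries a single number forward, while the simple mean, computed list-wise as described above, must keep every arrival time.

```python
data = [5, 10, 10, 15, 20]        # arrival times in minutes relative to 6 PM

# Decaying average: only one stored value, updated in place.
expected = 0.0
for arrival in data:
    expected += (1/8) * (arrival - expected)   # nothing else is retained

# Simple mean, computed as described above: the whole list must be kept.
arrivals = []
for arrival in data:
    arrivals.append(arrival)
    mean = sum(arrivals) / len(arrivals)       # needs every value and the count
```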
There is a tremendous amount of emotion associated with the expected arrival time, as it sets thresholds of response. If one's expectations are
not met, then other thought processes kick in. For instance, if someone does not show up at the expected time, there is cause for concern. This
concern might trigger a different kind of response than just waiting. In general, our expectations, of whatever nature, are charged with emotion
due to their connection with decision-making. Experimental results have firmly established that there is a positive correlation between emotion
and memory. Consequently, the expected arrival time is also easy to remember.
We mentioned earlier that humans need a simple step-by-step procedure (algorithm) to determine an expected arrival time. This algorithm has
3 distinct requirements: minimal information retention, minimal computation, and incorporation of the recency effect (the most
recent information is most significant). The previous analysis established that the Living Algorithm's algorithm fulfills the first two
requirements. It is both easy to compute and easy to remember. But what about the recency effect? How does the algorithm deal
with this complex notion?
The Living Algorithm algorithm inherently incorporates the recency feature. The impact of each data byte (arrival time) is greatest when it first
enters the System (consciousness, in this case). After initial entry, the influence of the data byte begins to fade or decay. This is due to the
scaling down (step 2 in the algorithm) that occurs with each repetition of the process.
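The arithmetic behind this fading is simple, assuming the scaling step uses a fixed ratio D (an assumption for illustration; the ratio 1/8 is ours): a data point enters with weight D, and each later repetition multiplies its remaining influence by (1 − D).

```python
# Illustration of the fade, assuming a fixed scaling ratio D = 1/8.
D = 1/8
influence = D                    # weight of a data byte when it first enters
for iteration in range(1, 6):
    influence *= (1 - D)         # each repetition scales the remainder down
    print(iteration, round(influence, 4))   # 0.1094, 0.0957, 0.0837, ...
```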

Living Algorithm Algorithm computes Data Stream's Range & Tendency


In our arrival time example, not only does our human compute an expected arrival time, but also the range and the recent trends of this
expectation. These capabilities seem to be part of our innate predictive repertoire when estimating our partner's arrival time. For instance, our
expectation is not limited to a single number, but instead encompasses an expected range of possibilities and recent tendencies (plus or minus a
half hour and a little later than normal). Earlier it was claimed that these features are essential predictive tools in setting the thresholds that
determine our behavior: fight, flight, sex, creativity, or hunger. How does the Living Algorithm algorithm deal with this feature?
The Living Algorithm's algorithm determines these values with the exact same method that was used to determine the expected arrival time.
Let's start with a concrete example and then abstract it. Your partner tends to arrive home at 6 PM plus or minus 15 minutes. This evening she
shows up at 7 PM, a full hour later. Because the threshold of normal expectations was exceeded, your worry switch was beginning to turn on.
Although it was due to an accident on the freeway, your expectations adjust to the new input. Not only do you bump your expected arrival time
up, you also bump your expected range of arrival times up. In both cases, this adjustment is proportional to the difference. Perhaps you now
expect your partner home at about 6:05, plus or minus 20 minutes, taking into account a possible delay.
Up to this point your partner's arrival time had been stable. Your expectations of recent tendencies weren't leaning either way. Because she
arrived a full hour later than usual, you assign a new, later time to her recent tendencies. But the next evening, she arrives late again, this time
at 6:45 PM. (Working late.) Because normal expectations were exceeded again, you might even be a mite irritated that your
partner didn't call. Again all three expectations are bumped upward: 1) the arrival time, 2) the range of arrival times, and 3) the
recent tendencies of the arrival times. Again these adjustments are proportional to the differences. Maybe you now expect your
partner to arrive home a little later, 6:10 PM, with an increased range of 25 minutes, and certainly the recent trend is towards later.
With each new arrival time these three expectations are adjusted accordingly. No data need be retained. The expectations are immediately
adjusted and the exact data has no more value - like yesterday's weather. Although you may store extreme values, there is no need for any
database to compute the expectations. No numerical baggage need be lugged around. Although we assigned numbers to these changes, we
could just as easily have assigned words. For instance: 'She has been arriving home a lot later, just recently.' Or 'Her arrival times are much less
stable than they used to be.'
Because expectations determine the threshold of response, they have an emotional component as well. As a personal example: Herbert, an older
German waiter, always showed up 10 minutes early for his shift. One time he was 5 minutes late and everyone began worrying. Herbert's range
of expected arrival times, as determined by his past performance, was so narrow that even a slight variation was alarming. In Herbert's case, 5
minutes late was a lot later than normal. In contrast, my Person always showed up a little late. Even though he was later than Herbert on this
night, no one paid it any mind. Because he was in the range of his expected arrival times, no thresholds were crossed. No need to contemplate a
change in behavior. With Herbert, the manager had already begun to wonder if he should call the police. As this example shows, when people
behave outside of their usual expectations, others may respond emotionally, in this case with concern.
An awareness of the evolving features of your partner's arrival times leads to estimates concerning her future performance. These estimates
lead to expectation. Expectations tend to be emotionally charged, as they determine the thresholds of response. When expectations are met,
such as when the partner arrives home on time, no thresholds are crossed and no action needs to be taken. When expectations aren't met, such as when the
partner arrives home much later than usual, the threshold of expectation is crossed, and decisions need to be made about what should be done.
Because the three expectations (expected arrival time, range of arrival times, and recent trends) have emotional content, they are much easier to
remember. Scientific studies are conclusive: information that is emotionally tagged is much easier to remember than raw facts.
Computationally, this is a very economical system in terms of memory, computation and relevance. The Living Algorithm digests a single data
stream to create 3 very different expectations. Mathematical residue from the primary computation becomes the data in the secondary
computations. It is not necessary to retain the data that goes into making these computations. Only the current adjusted expectations regarding
arrival times have any meaning or relevance. One algorithm digests a single raw data stream to produce three emotionally charged
expectations. Due to this emotional component, these values are easy to remember.
Mathematically, the Living Algorithm creates 3 evolving, composite data streams from the ongoing raw data of arrival times: 1) probable
location, 2) probable range of variation, and 3) recent trends. Each of the data streams is characterized by a single value (not a database) that
describes a unique feature of the most recent moment. The primary composite data stream determines the most probable arrival time. It is
based upon the difference between expectation and the most recent arrival time. The Living Algorithm employs this original difference
(residue from the initial computation) as the data for the other streams. In the case of the trends of the data stream, the original difference is
digested as it is (positive or negative, a vector). In contrast, to determine the expected range, the initial difference is digested as a positive
number (a scalar).
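A minimal sketch of this three-stream idea, under the assumption that each measure uses the same fixed scaling ratio (the variable names, the ratio 1/8, and the sample data are ours): the raw difference feeds the trend stream as a signed value and the range stream as a positive value.

```python
def digest(location, spread, trend, new_point, D=1/8):
    """One iteration producing the three evolving measures (illustrative only)."""
    difference = new_point - location           # residue of the primary computation
    location += D * difference                  # 1) probable location (expected arrival)
    spread += D * (abs(difference) - spread)    # 2) probable range: difference as a scalar
    trend += D * (difference - trend)           # 3) recent trend: difference as a vector
    return location, spread, trend

location, spread, trend = 0.0, 0.0, 0.0
for arrival in [0, 5, -5, 60, 45]:              # minutes relative to 6 PM
    location, spread, trend = digest(location, spread, trend, arrival)
    print(round(location, 1), round(spread, 1), round(trend, 1))
```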

Living Algorithm Algorithm encompasses Words & Numbers


It is evident that the simple Living Algorithm algorithm fulfills the complex requirements of our arrival time scenario. It generates 3 powerful
predictive tools in a simple fashion without requiring too much memory. The recipe also incorporates the recency effect and decay into its
averages (measures). Further, it is easy to see that this algorithm could be the one we all use when predicting arrival times. The Living
Algorithm's computational requirements are natural and its memory requirements are simple and emotionally charged.
From this example and line of reasoning, it is evident that the Living Algorithm algorithm deals with imprecise verbal data as easily as it deals
with precise numbers. As a mathematical algorithm, the results are precise, exact, right or wrong, just as in any other mathematical or
computer-based algorithm. As an experiential algorithm, the results, although inexact and vague, are crucial in our ability to make
regular and useful predictions about the behavior of our environment. This analysis supports the claim of our title: the Living
Algorithm's algorithm encompasses both words and numbers. Another amazing feature of our marvelous Living Algorithm.
The Living Algorithm's mathematical system incorporates immediacy, relationship and fungible ambiguity. Further, the Living Algorithm's
algorithm can digest both numbers and relative terms. Can the Living Algorithm incorporate choice into her system? For answers to this
question, check out the next article in the series, The Mathematics of Informed Choice.
To see what Life thinks about this article and what her concerns are, return to our metaphorical world and read Comfortable with Living
Algorithm's Algorithm, Life wonders about Choice.


Mathematics of Informed Choice (vs. Deterministic Physics)


29 November 2015
08:46 AM

The Potential for Informed Choice: the final Requirement



In Data Stream Mathematics, the initial article of this volume, we argued that biological systems require some type of mathematical system to
digest the ongoing flow of environmental information. This data stream mathematics of living systems must satisfy some precise requirements.
This mathematical system must be able to address the immediacy of the moment as well as the ongoing relationship between moments. Further,
the mathematics must provide some type of fungible interpretative mechanism that sacrifices precision for meaning. This meaning must be both
descriptive and predictive.
In the volume's subsequent articles, we demonstrated that the Living Algorithm fulfills these requirements for a mathematics of living systems.
Conversely, Probability's mathematical system is unable to satisfy these requirements. There is an innate reason for these differences in ability.
Probability asks the question: what is the mathematical nature of fixed data sets? Conversely, the Living Algorithm asks the question: what is
the mathematical nature of dynamic data streams? The question that is asked determines the nature of the answer. Consequently, Probability
delivers answers that are related to general characteristics of fixed data sets, while the Living Algorithm provides answers that are related to the
individual characteristics of dynamic data streams.
The data stream mathematics of living systems must incorporate yet one more requirement. The system must include the possibility of
interaction with the environment. This interaction is an essential feature of living systems, as it enables the ability to monitor and adjust to
external circumstances in order to survive. In other words, the mathematics of living systems must also incorporate the possibility of Informed
Choice.
Can the Living Algorithm fulfill this crucial requirement? Traditional Physics, i.e. Mechanics, also specializes in characterizing data streams.
What are the differences in the approaches of these 2 mathematical systems to data streams? Does Mechanics incorporate the possibility of
Informed Choice?
The fundamental difference between the two systems is rooted in the basic equations that each employs to characterize existence. The equations
of classical Newtonian Physics utilize an infinite and continuous stream of numbers. As such, the focus is upon a number line. The Living
Algorithm is digital, in the sense that her number stream is comprised of discrete points. As such, her focus is upon individual numbers. This
seemingly small difference is the hairbreadth that leads to entirely different conclusions regarding life and the fundamental nature of the
Universe. To see why, let us establish a historical context.

The Historical Context of Newtonian Physics


In the ancient world, human culture in general tended to attribute animistic characteristics to matter. In other words, humans tended to
anthropomorphize their world. The animate and inanimate universe was infused with the same undifferentiated spirit. In this sense the ancient
world believed that even matter was alive, or at least had life-like properties.
Then, about half a millennium ago, along came Galileo. By rolling a variety of balls down an inclined plane, he found that each ball, regardless of
size or weight, had the same constant acceleration. It was evident that, instead of having a motive force of its own, matter obeyed universal
laws of motion.
A century later, Newton further articulated these universal laws of motion. It was found that these universal laws governed both the celestial
and the terrestrial worlds. That was certainly a mindblower. Prior to this point, most philosophers and theologians were firmly convinced that
Heaven and Earth obeyed different laws, if they obeyed any laws at all. Now, not only had Newton formulated some laws that governed the
motion of matter upon Earth, but these same laws also governed the revolutions of the celestial bodies in the heavens as well.
To solve these mind-boggling riddles, Newton derived a new mathematical method called calculus. Calculus' specialty is multiplying the
infinitely small by the infinitely large to come up with a normal everyday number. For instance, the mathematics of calculus enables scientists
to add up an infinite number of infinitely small points to get the length of a line. This unusual approach was instantly ratified because these
computational results correspond with material reality. In other words, Newton's calculus computes planetary position as well as the time it
takes for an apple to fall to the earth.
This type of computation also creates the notion of a continuous, unbroken space and time. This notion syncs up perfectly with the everyday
world in which we live. None of us worries about tears or breaks in the fabric of space and time. Further, the equations were able to precisely
predict the behavior of matter. These exact correspondences reinforced the notion that reality was essentially continuous. As a further
affirmation of this continuous worldview, Physics, in the Newtonian sense, has successfully employed continuous equations to precisely
describe the behavior of matter. And this ability has enabled humans to transform the world in unimaginable ways.
Intoxicated with this extraordinary success regarding the material world, philosopher/scientists generalized these findings to the biological
world of living organisms. Their reasoning was straightforward. The equations of Physics can predict the behavior of matter with nearly
absolute accuracy. Life is composed of matter. Thus, these universal laws of motion also determine living behavior. In other words, we live in
an automatic material world where everything is predetermined. Choice is but an illusion, concludes this logical chain. The continuous equations
of Physics lead to the philosophical faith in scientific determinism.
Because of the proliferation and dominance of these continuous equations of classical Physics, it was further believed that this was the only
type of equation. To indicate the importance of continuity to this perspective, Dirac won the Nobel Prize for uncovering a technique that
transformed discontinuous functions into continuous ones. As we shall see and have seen, the Living Algorithm is quite different from the
classical continuous equations of Physics. Physics specializes in number lines, while the Living Algorithm specializes in numbers. Accordingly,
we say that the Living Algorithm's method of digesting information is digital in nature.

Physics' Continuous Analog vs. Living Algorithm's Discontinuous Digital


The Sine Wave

The Triple Pulse
Let's examine the differences between the traditional continuous analog equations of Physics and the digital feature of the Living Algorithm
from a visual perspective. At right is one of the fundamental (perhaps even quintessential) graphs of Physics: the classic sine wave. The
sine wave is the basis of such universal phenomena as electromagnetic waves and spring action. It is based upon initial conditions and is continuous, as are virtually all
the equations of classical Physics.
Although the three alternating pulses of the sine wave and the Triple Pulse have many apparent similarities, the method employed to generate
these two graphs is as different as night and day. Although both are seemingly flowing curves, the first is based upon a continuous analog
equation, while the second is based upon a discontinuous digital equation. In fact, the two types of graphs seem so visually similar that it took
the Author over 8 years to realize that there is a fundamental difference between the two.
He was further amazed to find that this mathematical difference articulates a key point of departure between the traditional and the new
scientific perspective: the traditional approach emphasizes automatic processes, while the new scientific approach stresses the potential for
informed choice. (These ideas are detailed more completely in A New Age of Science.) As we shall see in the following paragraphs, the
continuous equations of Physics inherently exclude the possibility of choice. In contrast, the potential for choice is inherent to the Living
Algorithm System.
A Close-up of the Sine Wave

The continuous analog equations of Physics have no wiggle room. No matter how much the graphs of these equations are enlarged, they remain
a smooth curve. As an example, let's view a close-up of the classic sine wave of Physics (shown at the right). Note the curve remains unbroken.
No matter how many times the graph is blown up, the curve will remain continuous.
In contrast, the Living Algorithm System contains an abundance of wiggle room. This is due to the digital nature of the Living Algorithm's
method of digesting information. With each iteration (repetition of the digestive process), a new piece of data enters the Living Algorithm
System. The union of the Living Algorithm and the Raw Data Stream produces an ongoing Family of Measures. These Measures represent a
smoothing out of the Data's potential roughness. However, no matter how many times this smoothing out process is performed, the resultant
measures remain discrete. As contrasted with Physics' invariable automatic continuity, there is absolutely no connection between the points in
the ongoing data streams that the Living Algorithm generates. In fact, the elements of the data streams, like data sets, are inherently distinct.
A Close-up of the Triple Pulse


A close-up of the Living Algorithm's Triple Pulse is visualized in the graph at the right. The image illustrates how she is made up of distinct,
rather than continuous, parts. The apparent continuity in her classic representation is just an illusion. The illusion of continuity is due to the large
number of iterations (repetitions). In similar fashion, the characters in movies and cartoons appear to move continuously, but are instead based
upon distinct frames that are shown rapidly enough that our visual processor turns them into a moving picture. In the case of the Triple Pulse,
the bars create a similar effect to the cartoon frames. When there are enough of them, they give the image of the Triple Pulse a continuous
appearance.
As we've seen, the Living Algorithm digests external input in a digital fashion. As such, each data point in the stream of information is
discrete/individual. In other words, there is space between each data point. This space provides time to evaluate the meaning of the signal
and respond. Accordingly, the Living Algorithm's digestion method incorporates the ability of an organism to monitor and adjust to the
environment. The capacity to monitor and adjust is an essential ingredient of the ability to choose, the essence of informed choice. It is
evident that Living Algorithm mathematics incorporates the possibility of Choice, an inherent ability of living systems. In contrast,
the continuous equations of Physics provide no opportunity for an interaction with the environment.

Living Algorithm's Fresh & Free Data vs. Physics' Hard Data
To further assist our understanding of how the Living Algorithm's method of digesting data enables the potential for choice, let us contrast the
relation the Living Algorithm and Physics have to their Data.
The Living Algorithm requires an ongoing flow of fresh raw data to fuel her System. Further, this data is free, in the sense that it is not
predetermined by the Living Algorithm. In fact, the Data is entirely independent of the Living Algorithm. Instead of describing her Data, the
Living Algorithm organizes her data. This is of great use to Life, as the process provides the fresh & free Data with meaning.
In contrast, Physics has an entirely different relation with his Data Streams. Instead of organizing his Data Streams, he dominates them with his
continuous, automatic equations. Physics only needs Data to determine his absolute formulas. Once Physics gives birth to his magnificent
equations, he abandons his Data.
After the derivation of the mathematical formula, Data is unnecessary (except to maybe check results). Physics only needs the starting point
(the initial conditions). Once the initial conditions are determined, any computer can crank out the results of continuous (analog) equations. The
results of this amazing form of analysis can include the position, velocity, acceleration and force of virtually any material system. Plug in the
initial conditions and out come the results. The graphic visualization of these results generally includes a continuous curve from the distant
past into the infinite future. Everything follows automatically according to the immutable laws of the equations derived by the immortals
(Newton et al.).
This model drives the philosophy of scientific determinism. Under this way of thinking, the initial conditions at the moment of the Big Bang
determined everything that has transpired since - music, civilization, even your relationship with your dog - everything. God/Science knows all.
The only choice is to set the initial conditions. Once these are set, every point in the data stream is predetermined. No more choices. This is why
we say that Physics dominates his Data Streams with continuous Equations: a Master/Slave relationship. (Unbelievably, a significant
group of humans actually believe that Physics will eventually dominate every data stream with his marvelous equations.)
Because the equations of Physics are derived from an examination of the Data, the accuracy and precision of the Data is of paramount
importance. To indicate this importance, Physics refers to his data as 'hard data': the harder the better. Because his equations are able to
absolutely dominate this hard data, Physics is referred to as a hard science, perhaps the quintessential hard science.
Another indication of the importance of Hard Data is that theory and accuracy of measurement have moved hand-in-hand, with one or the other
leading. As an example of data leading theory, the accurate mapping of the position of the stars by ancient civilizations eventually
led to Ptolemy's planetary theory. An example of theory leading data: Copernicus' revolutionary idea that the earth revolves
around the sun was finally validated a few centuries later, when advances in technology made it possible to more accurately
measure planetary position. In the case of modern Physics, unusual experimental discoveries that were made possible by
advances in measurement technology led to the brilliant theoretical formulations of Einstein, Heisenberg, Feynman et al. This co-evolution of Hard Data and Theory characterizes the Hard Sciences.
In contrast to the equations of the hard sciences, as epitomized by Physics, the Living Algorithm is not derived from the Data. Her function is to
digest Data Streams - crunch an ongoing flow of numbers into an ongoing flow of central measures. This process organizes data streams to
reveal their meaning. As mentioned, the Living Algorithm does not predetermine the values of the instants in her Data Streams in any way.
Instead, she waits patiently for the next Data Point. Her suggestive descriptors give an idea what the value might be, but do not in any way
determine it. There is nothing automatic about her suggestive predictions. Her data is free.
But if the data in the stream is truly free, what can be said about it that is meaningful? Let's see.

Living Algorithm transforms Meaningless Instants into Meaningful Moments


The Living Algorithm digests a data point each step (iteration) along the data stream. We choose to characterize the individual data point as
an 'instant.' (Please note: the measurement of the length of our notion of an instant is necessarily arbitrary.) The Living Algorithm process
digests data points by creating a context that relates the current instant (data point) to the immediately preceding instants (data points). Each
time the Living Algorithm repeats her relating process, she determines what we define as a 'moment.'
According to our definition, each instant is independent. The Living Algorithm's ongoing Family of Measures determines the ongoing
relationship between these instants. Without this context, instants are inherently devoid of meaningful pattern. Due to the context provided by
the Living Algorithm, moments are inherently filled with mathematical meaning. In essence, the Living Algorithm transforms
meaningless instants into meaningful moments. Or, to give yet another articulation of this process: the Living Algorithm's digestive system provides
meaning to Raw Data.
But this analysis is slightly misleading. The Living Algorithm is only able to transform the instants of a raw data stream into
meaningful moments if there is a biological organism to interpret or translate her message. The Living Algorithm only determines the
mathematical nature of each ongoing moment. The organism must interpret these essential clues to give them meaning. The statement 'The
Living Algorithm provides meaning' is just shorthand for the above analysis. It is important to always remember that a biological intermediary
is required to interpret the Living Algorithm measures and give them meaning. This factor becomes very important when we discuss the mass
of attention.

Potential for Informed Choice


To see how the Living Algorithm process incorporates the possibility of choice, let's again contrast her tao with the tao of Physics.
Let us suppose that an organism is given the continuous equations of Physics to determine the nature of the next moment in the data stream. The
organism may as well put everything on automatic, as all future points are predetermined with no possibility of choice.
Now let us suppose that the organism utilizes the Living Algorithm to determine the nature of the next moment. The Living Algorithm,
according to her nature, digests the data stream and provides the organism with ongoing, evolving measures that describe the trajectories of
each moment. These descriptions also provide estimates of future performance. With each iteration (repetition), the organism has the
opportunity to evaluate the moment according to the Living Algorithm's measures and then make an educated choice as to what to do next,
based upon this information. This opportunity for informed choice exists at every point in the data stream. This is called an interactive feedback
loop. Stimulus/response, monitor/adjust, give and take are other ways of referring to this omnipresent phenomenon that occurs every moment
that an organism is alive.
Reiterating: in the Living Algorithm System there is the potential for choice from moment to moment. Accordingly, there is abundant wiggle
room in this digital system, which does not exist in an analog system such as Physics. Further, the potential for choice is informed, not random,
because the Living Algorithm's measures provide estimates regarding the future.
In the case of the Triple Pulse System, which we previously referred to, the choice is simple: whether to participate or not, On or Off, Yes or
No. This choice has a direct effect upon the future. Each choice ripples through the subsequent system. The multiplicity of effects
includes: increasing intensity, killing the intensity, allowing the system to settle down, or even disturbing the peace. This living
action, these effects, are illustrated in the graphic visualizations of the Triple Pulse computer experiments, chronicled in Triple Pulse
Results. Each choice in the Triple Pulse System has a significant impact upon subsequent developments.

Closed Systems of Matter vs. Open Systems of Life


The traditional analog system of Physics has a very different relationship to the potential for choice. Once the starting point (initial conditions)
of the equations is set, the relationships between the variables in the equation determine all future circumstances. An external force is required
to change the state of the system. This verbalization is one of Newton's marvelous laws of motion. The equations of Physics accurately
determine all material behavior. Without an external force, behavior is automatic. Once the initial conditions have been set, the results of the
equations of Physics are predetermined. No new information can be incorporated. The equations operate in a closed system.
There is no such thing as a closed system in the Living Algorithm world. This is inherent to the methodology (tao) of the Living Algorithm.
The Living Algorithm's method of processing is porous to the external world. With each iteration (repetition of the process), a new byte of info
energy from the outside enters the Living Algorithm System. Each new info byte stimulates change to the System and therefore provides
energy. (We will deal with info energy in the Data Stream Dynamics notebook.) For example, in the case of the binary (1 or 0) Triple Pulse
System, each data bit (1 or 0) may or may not be a pulse of info energy, depending on which number it is. Further, the individual value of each
elemental bit has an effect upon the future of the System. This porousness to the outside indicates that the Living Algorithm System is not
closed.
The material systems of the hard sciences epitomized by Physics are always closed. In contrast, the info system of the Living Algorithm is
never closed. This difference has some important implications. Because Physics operates in a closed system, it is possible to formulate the
automatic laws of matter. Because the Living Algorithm operates in an open system, it is impossible to determine any definitive laws. The
closed systems of hard science mathematics are intertwined with the conservation of energy, a core principle of material sciences. In other
words, material energy is conserved in a closed system. Because variable amounts of info energy enter the Living Algorithm
System with each iteration, the conservation of energy cannot possibly apply. Closed systems that characterize the material world
are inherently not sensitive to environmental context. In the Living Algorithm's open system, context is an omnipresent feature.
Finally, closed systems by nature cannot possibly incorporate the potential for choice. In contrast, the potential for choice is
inherent in each repetition of the Living Algorithm process.
To cement understanding, let's look at this difference from another perspective. The basic equation for the analog graph comes in this form: y =
f(x). This has a simple meaning. If x, a variable, is assigned any number, then y, another variable, is automatically assigned another specific
number. Further, x can be any number, real, imaginary, or complex. Each pairing of x and y is true forever and always, no matter what went
before or after. The content is of paramount importance. In contrast, while the Living Algorithm could be written in the same form, y = f(x), the
first variable, x, can't just be any value. It is a specific value in an ordered data stream. Further, the value of y, the other variable, is totally
dependent upon what went before. The context is of paramount importance in determining the result.
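The contrast can be sketched in a few lines of illustrative code (the names and the choice of sine function and scaling ratio are ours): the analog form pairs any x with a y regardless of history, while the stream form only makes sense applied to an ordered stream, because each result depends on everything that came before.

```python
import math

# Analog style: y = f(x). Any x yields the same y, regardless of what came before.
def f(x):
    return math.sin(x)

# Stream style: the result for each data point depends on the running context.
def digest_stream(data_stream, D=1/8):
    average = 0.0
    results = []
    for x in data_stream:                # x must come from an ordered stream
        average += D * (x - average)     # the new value depends on all prior values
        results.append(average)
    return results

print(f(0.5))                            # always the same answer for x = 0.5
print(digest_stream([0.5, 0.5, 2.0]))    # each answer shaped by what preceded it
```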
One of the amazing discoveries of Science is that the material world operates in a closed system, where energy is conserved. Due to this closed
nature, abstract equations can be written which accurately characterize the relationships of matter. The bound nature of the system enables
mathematicians to specialize in ever more complex equivalencies between abstractions and their derivatives. Of course, living systems
have a material component that obeys these same laws. Intoxicated, however, by the success of this approach to the material world, some
scientists infer that hard core rules dominate all features of existence (the root of scientific determinism). However, Life, like the Living
Algorithm, exists as an Open Information System. There is an inherent permeability between the internal and external world that characterizes
all biological systems, from a single-celled amoeba on up. In living systems there is a constant give-and-take (stimulus/response) that enables
the possibility of informed choices.
Newtonian Physics does not incorporate the potential for stimulus and response. The traditional analog equations of Physics combined with
initial conditions are the sole determiner of future events. Conversely, the Living Algorithm, like Life, incorporates the possibility of choice with
her digital equations. This give and take relation to data streams is yet one more similarity between Life and the Living Algorithm.
Despite their many differences, the mathematics of the Living Algorithm and Physics are bound together by a common thread, which could
more accurately be described as a superhighway. The two forms of mathematics, with their uniquely different fields of action (life's
informed choice & material determinism), are bound together by classic Newtonian concepts, such as force, work and power,
mass, space, and even time. These concepts, while providing a common element between the two polar systems, have radically
different manifestations. To see where these complementary systems merge and diverge, check out the next notebook, Data
Stream Dynamics.
To begin to understand this intersection between two orthogonal planes of existence, we must first get to know the Living Algorithm a little
better. We have seen her in action, but we have yet to meet her. The initial article in the stream sets the stage with an exploration of the Living
Algorithm's algebra. Don't worry; the discussion doesn't require a mathematician. The Living Algorithm is a simple equation, only requiring a
basic knowledge of arithmetic. To better understand the underlying patterns of this unique equation's innate nature, check out the Living
Algorithm, her Instantaneous Self.


Life searching for a Mathematics of the Moment


29 November 2015
08:49 AM

CE1-2: Life searching for a Mathematics of the Moment



Let's encapsulate our story. Life is searching for a mathematical partner that will be sensitive to her subtle immediacy. Probability's Data Set
Mathematics, with his big picture focus, accurately captures the features of the general population. However, because of this specialty he
doesn't have the tools to understand Life's immediacy. Needless to say, her relationship with Probability has proved disappointing. His
constant focus on providing measures for her fixed data sets has provided stability, in that he has accurately characterized her permanent
features, even making definitive predictions about her fixed nature. Yet, Probability's style, while dependable, has frustrated
Life.
To understand her subtle and immediate nature, Life requires a mathematics of data streams. Her subtle character is more associated with the
momentum of recent moments than it is with fixed and permanent features. If she hears of her general tendencies one more time, she is
going to scream. She almost feels that Probability is objectifying her, rather than appreciating her for who she is and the characteristics that
make her special. He even trivializes her ongoing data streams by transforming them into fixed data sets. While Probability accurately
characterizes these fixed data sets, Life wants a mathematics that is sensitive to her ongoing data streams.
But this new mathematics of data streams can't be just any old data stream mathematics. This new mathematics must fulfill some stringent
requirements if it wants to be considered the Mathematics of Living Systems. Life is very particular about who she partners up with. To be
sensitive to her needs, this mathematics must weight the current moment more heavily and provide ongoing predictive descriptors that
pragmatically characterize the trajectories of the moment. Further, due to Life's inherently changeable nature, she requires suggestive
predictors that incorporate a range of possibility. This relative imprecision is an asset, not a liability. Probability's definitive predictions are
too exacting and general to be sensitive to Life's contextual spontaneity. Life does not want to be boxed in. She has felt suffocated by
Probability's approach. To form a new mathematical relationship, Life is looking for a Data Stream Mathematics that is sensitive to the
special meaning of the moment; in short, a Mathematics of the Moment.
Where is Life going to find this special mathematics? Certainly not at a singles bar. Are her requirements too strict? Is she doomed to
mathematical isolation, her subtle immediacy unappreciated? For some preliminary answers to these questions, read the next
article in the stream, The Living Algorithm System. To continue with the metaphorical perspective, read on.

CE3-4: Is the Living Algorithm just an insignificant subset of Probability?



Life reflects upon her search for a mathematical partner.


"I want a guide that will assist me on my Journey - perhaps provide some direction. I even remember spewing to a friend just a short time
ago. "Matter has his math. I feel it only fair that I should also have a mathematics of my own one that is sensitive to my needs - not
just his. I won't mention any names, but one of my suitors actually ignorantly stated, "I'm convinced that Matter's Math will
eventually be sufficient for both of you." Who did he think he was talking to a lump of clay?
And then along came the Living Algorithm. I certainly wasn't impressed when I first met her. But I'd heard good things. One of my good
friends had introduced us. We went out a few times. She came dressed first as the Creative Pulse and then as the Triple Pulse. I came as
different forms of Human Behavior - mostly related to sleep. We got along famously - agreeing on everything.
My hopes were rising, but my standards are high. I'm big on data streams; they're everything for me. Without the ability to digest data
streams, I am nothing - dead, just like an inanimate piece of matter. So it stands to reason that if a mathematics is going to provide me with
guidance, it must also be a specialist in my beloved data streams. In an earlier discussion with Alga, she provided evidence that her sole
function is to digest data streams. I was so thrilled.
But then Probability, another suitor, tried to horn in. "I also digest data streams. They are my specialty as well. Further, I am far more suited
than the Living Algorithm to characterize your features." He then boasted, "Within my system I have a multiplicity of fancy methods of
complex analysis. The simplistic Living Algorithm is all by herself. What guidance could she possibly provide that I couldn't do better? In
fact, she is just a subset in my domain, a very small subset, at that. Actually, not statistically significant at all."
I had long since grown tired of Probability's obsession with my general features. But his words disturbed me. Was the Living Algorithm just
an insignificant subset of Probability? Was Probability really the mathematics of my dreams? In my heart I didn't think so. Probability is just
sooo rigid. Everything is either right or wrong - within some specific limits and such. Don't get me wrong, Probability's boxes provide broad
guidelines - boundaries of behavior and such. But something is missing in his approach - my individual special nature.
I hate to admit it, but what he said tweaked me a little, actually disturbed my meditation, probably because of his prestige. I
decided to go to the Living Algorithm and confront her directly. She immediately began laughing. "Probability says I'm just an
insignificant subset of his? Hah! Forgive his ignorance. He's a bit insecure about all the phenomena he can't address. For
instance, he couldn't recognize a Moment, even if it was part of his beloved sets. I am not subservient to him; we are
complementary systems. We each have a unique purpose and field of action. Although we are both equally obsessed with data,
he processes data sets, while I digest data streams. To analyze my data streams he kills them by transforming them into data
sets - like capturing a wild animal to study it in a zoo. Not the same.
Alga: The last article compared our respective approaches to baseball's batting average. Probability is more effective in characterizing a
baseball player's entire season, while I am more effective in characterizing the player's recent performance. Probability provides an estimate
as to how well the player will do in his next season, while my predictive cloud provides an estimate as to how well the player will do in his
next at-bat. The information that Probability provides is more useful to the baseball community when determining seasonal awards and
annual salaries, while my predictive cloud has more utility to players, coaches and gamblers, when making immediate decisions on game
day. This example shows that rather than being subservient to him, we are complementary systems. He specializes in general features of the
data set, while I specialize in the individual moments of a data stream.
There might be some who still feel that I, the Living Algorithm, am but a subset of Probability. To sweep away any remaining confusion as
to our relationship, read the next article in the stream, Mathematics of the Moment (vs. Probability). Some of my good friends wrote it. I
think they did a pretty good job."
To remain in the metaphorical world, read on.

CE4-5: Probability challenges Living Algorithm's scientific credentials.



Life was relieved to find that the Living Algorithm's claims are true.
Life: "The Living Algorithm is not a subset of Probability, but a complementary system. Further, Alga's Predictive Cloud provides relevant
information regarding individual moments, something that is very important to me. Could she be the mathematical system of my dreams? Is
it possible that she could reveal some codes to my living matrix that will enable me to better actualize my potentials?"
This pleasant reverie was disturbed when she saw Probability striding confidently towards her, obviously with some purpose in mind.
Life could sense from his jutting chin that he still had bones to pick over their last interaction.
After pleasantries were exchanged, Probability asked in a not-so-innocent fashion: "So how is your mathematical relationship developing
with the Living Algorithm?"
With a twinge of the victor, Life: "Great. She is able to address aspects of my innate being that the rest of you have ignored."
Probability: "So we're not good enough for you anymore?"
Life: "Sorry. I didn't mean to be offensive. I love you all. I really appreciate the unique form of guidance that each of you provides. Physics
really understands the dynamics of my matter, while you are a specialist on my general features. I'm especially excited about Alga right now
because she specializes in my immediacy and my ability to choose. As complementary systems, each of you addresses a different side of my
innate nature. Remember in our last encounter, we found that Alga is not a subset of your system, but that you are complement ary systems
instead."
Life could sense that she had pushed some buttons because the muscles in Probability's smile tightened up into a grimace.
Probability: "I accept the argument that the Living Algorithm is not my child. Despite our common obsession with Data, we have unique
fields of action, mine data sets and hers data streams. Consequently the questions we ask and the answers we get involve unique, yet
complementary, matrices of thought. She characterizes moments in the data stream, while I characterize entire sets. The Living Algorithm's
results are ever changing, while my results are permanent and never changing. I appreciate the pragmatic utility of the information that the
Living Algorithm provides. However, the transitory nature of this information, combined with the individual nature of the data streams she is
analyzing, limits, if not eliminates, any scientific value of her analysis. In contrast, the permanent nature of my results, combined with the
general and fixed nature of my sets, renders my analysis perfect for the scientific community. Because of my talents, scientific endeavors
confidently employ my computations and measures to establish the validity of their results. Look at how famous I am in the world of
subatomic particles. What scientific efficacy does the Living Algorithm have, if her analysis is transitory and her data streams individual?"
Rattled and not really understanding, Life responded feebly, "But what about the Living Algorithm's Predictive Cloud? It certainly provides
a unique and pragmatic perspective regarding individual moments, something that even you can't do."
Probability, derisively: "A Predictive Cloud!? What kind of predictions can be made with a cloud? Sounds ambiguous to me."
Life, uncertainly: "I might be wrong. But it seemed that the Author's most recent article illustrated that the Living Algorithm's Predictive
Cloud does a good job with the batting average, perhaps even better than you, when it came to the ongoing games in a player's
career."
With a sense of superiority Probability boasted, "Ptah! I'll grant you that the Living Algorithm's method of analyzing the batting average
might be useful to gamblers or coaches, but it has no scientific value and hence no significance. Stick with me, if you want some definitive
answers. Go to her, if you are happy with mere suggestions."
Life was confused again. Questions raced through her mind, over and over again like a broken record. "Are my data streams so transitory and
individual that I must be content with a pragmatic mathematics that has no scientific value? And why is Probability so famous in the
subatomic world? Plus, why doesn't baseball's batting average have any scientific significance?
This is all too confusing. To sort things out, or at least establish priorities, I'm going to meditate."
Ommm? Whoa! Recently, I've been riding a rollercoaster of emotions, and all due to my budding relationship with the Living
Algorithm. Maybe I should just remain single to eliminate these psychic disturbances. But that wouldn't help. I seem to crave a
mathematical partner. I had grown dissatisfied with other more traditional mathematical choices for constantly attempting to box
me in - regulate my every move. I decided to look for an alternative, a mathematical guide that might help me to realize my
potentials by unlocking the code to my matrix. Friends introduced me to the Living Algorithm, which led to a few successful
interactions. Wanting to take our relationship to another level, I posed some requirements, which the Living Algorithm fulfilled.
Everything was going perfectly. It even seemed that the Living Algorithm and Probability, as complementary systems, might be
able to join together to provide me with a more comprehensive set of clues to my behavior.
"But then Probability challenged the Living Algorithm's scientific credentials," Life fretted. "Perhaps he was jealous of all the attention I was
giving to the Living Algorithm. Perhaps he is right. Could the Living Algorithm just be a poseur - pretending to be a valid system, but
without any real scientific foundation? If her insights have no basis, how can I trust the codes she reveals? What was it tha t Probability said?
Oh right. The Living Algorithm's analysis, while pragmatic, is too transitory and individual for Science. And then when I bro ught up how
much useful information the Living Algorithm provided in the batting average example, Probability just snorted derisively. 'T he batting
average has no scientific significance.' Does Probability have a valid point or is he just attempting to undermine my budding relationship
with the Living Algorithm?"


To see how much validity Probability's critique has, read the next article in the stream, Living Algorithm Patterns. To continue in the
metaphorical world, read on.

CE5-6: Probability's Numbers vs. Living Algorithm Patterns



Smiling cockily, Probability summarized the analysis: "My batting average has a proven pragmatic value, as evidenced by the fact that it is a
factor in determining both strategy and salary. However, due to its individual nature, it can't be compared to any other set with any kind of
scientific certitude. Accordingly, the batting average has no scientific value. The same analysis applies to the Living Algorithm's predictive
cloud. Data streams are so individual and transient that it is impossible to achieve the certitude that Science requires. I must admit that her
predictive cloud characterizes moments much better than I. Nevertheless, her descriptions of moments have no more scientific validity than
my batting average, for the same reason. It is impossible to generalize the results to other data sets due to the individual character of the data
we are analyzing.
However, I, Probability, can generalize my analysis of one homogeneous matter set to another. Atomic particles, whether electrons, atoms, or
molecules, are identical and obey the same universal laws in all times and places. Under the same circumstances, one electron behaves the
same as another, no individuality whatsoever. Because of my ability to generalize my analysis, Science has admitted me to his
exclusive Circle. While the Living Algorithm's Predictive Cloud provides potentially pragmatic information regarding future
moments, this alone will never get her into the Science Circle. Because she only deals with transitory moments in individual
data streams, there can be no certainty of her predictions. Due to this lack of certitude, the Living Algorithm analysis is of
questionable value, at best. Accordingly, Science ignores her analysis.
I, on the other hand, am famous in the scientific community. Anyone who has anything to do with science must have at least a rudimentary
knowledge of my system. In the hard sciences I am world renowned for cracking the code to the electron's matrix. The soft sciences employ
my skills to establish the validity and significance of their experimental studies. They all worship at my altar. I will eventually solve your
code just like I resolved the code of the subatomic matrix. Your Living Algorithm is unnecessary."
Somewhat taken aback by Probability's arrogance, Life responded quietly, "So you are going to identify all my general characteristics and
then claim that this is me?"
Understanding the subtext of my comment, Probability angrily blurted, "You just wait and see! I will be able to predict your behavior just
like I do matter. After all, you are just a sack of atoms." After asserting his presumed dominance, he stormed out.
I relayed this story to the Living Algorithm in our next encounter. She laughed so merrily that I joined in. "Probability is getting desperate.
He is overly attached to becoming your mathematics. He needs to detach a bit from these expectations. They create emotional chains that are
disturbing his internal peace. He is already famous the world over for his achievements. Why does he feel a need to dominate human
behavior? Perhaps it is a way of compensating for a sense of inadequacy over his inability to address the dynamic nature of existence.
Let me first address his claim that my method has no scientific value because of the transient and individual nature of the data streams that
are my sole obsession. As evidenced in our batting average example, my predictive clouds supply an abundance of practical information
when applied to living data streams. Experimental evidence suggests the likelihood that Life employs this pragmatic tool, my predictive
clouds, for assessing environmental patterns to best determine the most appropriate response to ensure survival. If Life employs my
predictive clouds, then Life is also subject to my information patterns. In Triple Pulse Studies, the first notebook in this series, we examined
many examples of how Life has employed the Triple Pulse, one of my many information patterns, to organize human behavior associated
with sleep. Accordingly, my scientific value lies in my ability to reveal the underlying information patterns that motivate behavior. However, I
can't establish the scientific certitude of these connections on my own. I require Probability's analytical talents to verify, or at least establish
the limits on, the correspondences between human behavior and my information patterns. Thank you, Probability."
Life was relieved to find out that the Living Algorithm was not a poseur. Her method had scientific merit, even though she had to rely on
Probability's services to establish certitude. "What a great team you can be. Probability can provide helpful information about my general
features and assist you in your quest to establish your scientific validity. You, on the other hand, can unlock the code to my personal
dynamics, the matrix of my behavior."
Alga: "Exactly. We need each other to provide a comprehensive picture. Probability is helpless before your dynamic nature, as his specialty
is static data sets, not dynamic systems. He can establish precise definitions, but no causal mechanisms. At best he can provide a rough map
of the landscape. This is very useful because it reveals where you can go and where you can't. But the map reveals very little about inner
motivations and potentials - the factors that influence and inspire your behavior. That is my specialty, as my sole focus is the
dynamics of data streams."
"Ironically, the story of how Probability became famous as ruler of the subatomic world illustrates both his inherent strengths and
weaknesses. Further, it pertains to why my dynamic nature is ideally suited to determining causality, while his static nature is more suited to
description. As with other aspects of our respective systems, these talents are mutually exclusive. Read on to see how Probability was able to
patch up the gaps in the subatomic universe that were left by classical Mechanics, the star of Physics. In so doing, Probability became the
new star - both technically and philosophically - for a while at least. Fame is always so fleeting."
To explore these issues, read the next article in the stream Description vs. Causality; Static vs. Dynamics. To continue in the metaphorical
world, read Life yearns for Mathematics of Relationship.


A Trio of Mathematical Perspectives to deal with Life's Complexity


29 November 2015
08:50 AM

A Trio of Mathematical Perspectives to deal with Life's Complexity


Life puts down the article on Causation. "Wow! So many ideas are bubbling through my mind. I must get together with Alga so that I can
symbolize the concepts in my own words. A conversation will help me to assimilate and digest what I've learned."
After the necessary texting:
Alga: Greetings, friend. I hope that the Author's exposition addressed Probability's objections and cleared my good name. Smiling.
Life: It certainly did. The article was an amazing tour de force. If you don't mind, I would like to encapsulate the history, so as to cement
the ideas.
Alga: Certainly. Great idea.
Life: OK. Here goes. Mechanics, i.e. traditional Newtonian Physics, does an incredible job of employing his continuous equations
(epitomized by Newton's F = ma and Einstein's E = mc²) to characterize the dynamics of matter. However, as the elemental particles become
smaller and smaller, his computations become more and more prohibitive. Calculating the motions of the eternal planets, while
complicated, was simple compared to calculating the trajectories of billions upon billions of colliding atoms. While retaining their vision of
tiny particles interacting, Physics engaged Probability to do the computations. The mathematics of both systems miraculously yielded the
same results.
Although Mechanics and Probability were totally compatible computationally, their theoretical implications are as far apart as night and
day. As mentioned, the traditional continuous equations of Mechanics implied that space and time are unbroken and continuous. Further the
universe is composed of matter that moves through this space and time in a fashion that could be predicted by these equations. The world of
atoms was so very well behaved that Physics believed that he was on the verge of completely understanding the universe. He had employed
Probability's computational tools, not his religion.
But when hard data concerning the subatomic world started pouring in, Physics had to change his tune. His continuous equations, which
worked perfectly well with atoms, had some holes that needed patching. It was as if his continuous equations, which defined the behavior
of atoms almost perfectly, could only see one facet of the subatomic world at a time. Specifically Physics could see electrons and photons
of light as fixed particles or as moving waves, but not as particles moving through space, as he had previously. Only Probability could
bridge this gap. But this introduced uncertainty where certainty had been.
The squawking was extreme in the beginning. Frequently, the cognitive dissonance of outrage and laughter is the result when two
orthogonal matrices intersect for the first time. But after a generation of scientists passed on, the entire scientific community became
comfortable with this fertile new perspective. It is now taught as scientific fact in colleges all over the world.
Symbolizing the material world requires two different mathematical systems, not just one. The certainty of Mechanics rules the atomic
world on up, while the uncertainty of Probability rules the subatomic world. Mechanics reveals the dynamics of matter, while Probability
reveals the probable position. Modern Physics now becomes the merger of Probability and Mechanics.
The dual mathematical system of Physics has successfully characterized the entire material world, both atomic and subatomic. Despite his
assertions to the contrary, this dual mathematical system has not been able to address the unique features of material systems that are alive,
i.e. me, Life. That seems to be your job.
Alga: Precisely. But instead of revealing precise behavior (position) with my mathematics, I reveal the dynamic information patterns that
influence human behavior. Further, I require Probability's talents to verify these patterns of correspondence. In other words, Probability is
the ultimate judge as to whether my predictions have any scientific validity.
Life: I really liked the Author's suggestion that all three of you, a trio of mathematical systems, are required to address the unique features
of my living matter.
Alga: Me too. Your presence in the Universe introduces another level of complexity that is inaccessible to the purely material perspective
that Physics provides.
Life: Thanks for listening to my spew. This dialogue was very helpful. But my Pulse is fading fast. Time for a Rest Pulse to refresh my
cognitive abilities. See you soon.

Life yearns for Mathematics of Relationship


After this conversation, Life felt reassured about the Living Algorithm's scientific credentials. She was thrilled about how her relationship
with the Living Algorithm was developing. It seemed too good to be true. She sometimes even pinched herself to make sure she wasn't
dreaming. She had become so discouraged that she had almost given up hope of ever finding a mathematical guide.
As the types of mathematics had proliferated over the millennia, she had grown hopeful that there might be a mathematics out there
somewhere, who could assist her in her quest for personal understanding. "Everyone else has their own math, why not me?" she queried a
friend. "The philosophical Greeks applied their math to Earth's space, which eventually extended into the Heavens. The mystic Indian
Hindus, of course, discovered Nothing, or zero. Then the sober Muslim mathematicians specialized in static symmetries, manipulating
abstractions in incredible ways. Finally, the exploratory Europeans moved into the world of dynamics.
That is when I became so excited - at long last, a math that deals with motion and change. My time had come. You
see, although I exist in space and time, I am primarily a dynamic entity. Not like the stars with their regular orbits that last for
eons, but constantly adjusting to context. A math that doesn't address my changeable nature doesn't really understand me.
That is why Probability doesn't interest me so much. Don't get me wrong; I really appreciate what he does for me. He provides
me with a plethora of information about my static features. This is very useful, but that is not what makes me tick.
I am all about immediacy, relationships, and choice. I exist in the Moment, nowhere else. But my Moment is not instantaneous; it is
charged with memory and expectation, even an emotional momentum that propels me forward. Further, I only exist in connection to other
living beings. We cooperate with each other in order to survive. This relationship is essential if any of my infinite transformations are to
persist. And, of equal importance, we make choices to facilitate survival. The good choices are rewarded and the bad choices are punished.
Most of these choices verge on automatic, but recently a new strain has developed a sophisticated ability to make
conscious decisions. This is why I want my own mathematics.
That is also why Physics is such a disappointment. As said, I was so excited when those Europeans, Galileo, Newton, and such, got
into dynamics. Finally, a mathematics that addresses Change, the essence of my Being. However, my initial infatuation with the
mathematics of Physics was eventually replaced by despondency. It became evident pretty early on (maybe with Descartes)
that Physics was primarily interested in the automatic behavior of matter - planets, atoms, and such. Perhaps I was in a state
of denial, but I had secretly hoped that somehow his focus on change would eventually lead to me.
Boy, was I deluded. Instead Physics had the gall to say that I was only made of matter and had no real say in what my next move was. He
denied my unique ability to choose - to make informed decisions about my future. That was the very reason I wanted my own mathematics,
so s(he) could help me to make better decisions about what to do next. My dearest hope was that this mathematics would assist me in my
quest for Self-Actualization. That is my only real desire - to fulfill my potentials. And then Physics arrogantly claims that Choice is an
illusion. According to him, even Human Behavior is solely determined by the collisions of subatomic wave/particles that go backward and
forward in time. My hopes for a compatible relationship were dashed. That is when I decided to look elsewhere.
I almost gave up hope of ever finding a Mathematical Guide - someone who could assist me on my difficult quest. Many of my
friends told me to give up this impossible dream. This mathematics doesn't exist. You will never be compatible with his
numbers, they claimed. You're too spontaneous; he's too rigid.
And then along came the Living Algorithm. Certainly not much to look at. Our first date was casual. I didn't have much hope. Amazingly
we agreed almost completely on the Interruption Phenomenon. We didn't go on our second date for quite a while. We didn't really know
what else we had in common. After the Living Algorithm matured a little, we had a great series of interactions concerning the Triple Pulse's
relation to sleep-related phenomena. Again, the codes of our matrices seemed to be compatible. We agreed on everything.
Trying not to get too excited, I wrote down a series of requirements that I had for a mathematical partner or guide. First and foremost, the
mathematics needed to address the immediacy of data streams. I was excited to find that the Living Algorithm is all about Immediacy. Her
specialty is digesting data streams - turning instants into moments. But I need a Mathematics that also addresses relationships between
moments in time. Although I exist in the moment, I am connected with the past and have a sense of future potentials. Does the Living
Algorithm relate moments together, and if so, do they have an effect on each other?
Some of my friends say that I am being too hard on the Living Algorithm - making too many demands. But I am special, in this wide
universe of ours, and have specific needs that must be fulfilled before it is worth it for me to form a relationship with any
mathematics. Is the Living Algorithm the mathematics of my dreams - the mathematics of relationships as well as the
mathematics of the moment?"
To find out if the Living Algorithm can fulfill Life's ideal needs, read The Mathematics of Relationship. To see how this interaction turned
out, read on.

CE7-8: Can the Living Algorithm provide Life with Fungible Meaning?

Life was exhilarated to find that the Living Algorithm created a system where moments interacted with each other, where what happened in
the past had an effect upon the future. She thought to herself hopefully, "As our relationship develops, it seems that the Living Algorithm
and I have exceedingly similar matrices. Is she the mathematical guide of my dreams? Before claiming this title, the mathematics of the
Living Algorithm must somehow deal with fungible meaning - in the sense of sublimating precise detail for the larger picture. The
others worship precision and, as such, miss essential meaning. Alga must be in my camp, not theirs, if she is to be my method
of digesting environmental input."
We have had many successful interactions. Our theories and experiences correspond almost completely on certain issues - including
interruptions to a creative session and many sleep-related phenomena. However, these could be only superficial similarities, like enjoying
the same music. To move to a deeper level in our relationship, I have required that the Living Algorithm pass four ordeals.
These ordeals did not include dragon-slaying, maiden-rescuing, or tyrant-overthrowing. Nothing aggressive like that. Instead the Living
Algorithm must incorporate Immediacy, Relationship, Fungible Meaning, and Choice in her system. She has passed the first two ordeals
(Immediacy & Relationship). Can she pass the third ordeal - provide Fungible Meaning? This means she would have to abandon the
precision that is a trademark of her trade.
Many say that my demands are impossible, claiming that mathematics and lack of precision are antithetical. Some who are sympathetic to
the Living Algorithm have even attempted to persuade Life to lower her standards. "Be happy that you are compatible in so many ways and
get along so well. No mathematics can possibly be perfect. Don't dismiss her as a Mathematical Guide just because she can't deal with
fungibility. That is an impossible task for any mathematics."
"I'm sorry," Life responds, "but the biological systems that I embody require a fungible interpretation of environmental input in order to
survive. We have sacrificed precision for meaning so that we can recognize pattern. This ambiguity of interpretation enables us to identify
familiar objects and processes in unusual contexts and from peculiar perspectives. I demand the same from my mathematics. If the Living
Algorithm can't incorporate fungibility into her process, how will she ever be able to understand me well enough to give me any guidance?"
To see if the Living Algorithm can pass this next, seemingly impossible, ordeal, read the next article in the stream Precision vs. Fungible
Meaning. To find out how the encounter went, read on.


CE8-9: Could Life employ Living Algorithm to digest Data Streams?



Life: I am thrilled that you, the amazing Living Algorithm, have been able to pass my third ordeal - providing a fungible interpretive
mechanism.
Living Algorithm: It wasn't difficult; after all, fungible is my middle name. When applied to biological systems, the word fungible has to
do with sublimating detail for meaning. In my digestion process I immediately trade in the precision of the instant (the data point) for the
meaning of the moment (an ongoing fungible average). These fungible averages, my Predictive Clouds, characterize the meaning of the
moment in relationship to what went before. My Clouds provide the foundation of the rough approximations necessary for the
meaning-making of pattern recognition.
Life: Although it is evident that we are compatible in so many ways, you still have to pass one more ordeal. Your system must incorporate
the possibility of informed choice.
Alga: Of course. I think you will not find me wanting in this department. But let's relish the moment of our present triumph rather than
getting lost in future tasks.
Life: We get along so well and have so many features in common. Could it be that the living systems that I embody employ you to digest
data streams?
Alga: It could be. In the attempt to remain alive, you are continually trying to understand the meaning of the moment. Accordingly,
you need some type of interpretative mechanism to facilitate this task. Coincidentally, or perhaps not, my entire focus is upon defining the
meaning of the moment, nothing else. My measures, the Predictive Clouds, are computed by taking into account ongoing data that is
weighted by proximity to the present moment. My digestion process then computes the trajectories of the ongoing relationships between
moments. These ongoing relationships characterize the potential meaning of the data stream. As such, my Clouds provide an interpretative
mechanism that could be employed to reveal the patterns that are at the heart of meaning. I think you would find the information very
useful. Due to my singular obsession with characterizing the Meaning of the Moment and your need for this interpretative mechanism, I
think it is very likely that you employ me, or a mathematics very much like me, to digest data streams. Gotta get going. Have another
appointment. See you next time.
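(A note for readers who prefer code to metaphor: one minimal way to picture the digestion Alga describes is a running average that weights each instant by its proximity to the present moment. The function name decaying_average and the decay constant D below are illustrative assumptions, not anything the dialogue specifies; the sketch is an aid to intuition, not the Living Algorithm's actual formulation.)

def decaying_average(data_stream, D=10.0):
    """Digest a stream of instants into a running 'meaning of the moment'.

    Each new data point is blended into the prior average with weight 1/D,
    so recent instants count the most and older ones gradually fade.
    (D is an assumed decay constant; the dialogue does not specify one.)
    """
    moment = None
    moments = []
    for x in data_stream:
        if moment is None:
            moment = float(x)               # the first instant seeds the moment
        else:
            moment += (x - moment) / D      # blend the new instant with the faded past
        moments.append(moment)
    return moments

# Example: a noisy, batting-average-like stream of hits (1) and outs (0)
print(decaying_average([1, 0, 0, 1, 1, 0, 1, 0, 0, 0]))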
Life: Whoa! Could it be that I have found the data stream mathematics of my dreams? Alga, via her clouds, is certainly sensitive to the
ongoing, changeable, and immediate nature of the data streams that define my existence. Plus her Clouds provide a plausible
meaning-making mechanism that I could certainly use. While this argument makes logical sense, the graveyard of science is filled with ideas
that made lots of sense. Aristotle's system of understanding dominated western thinking for about two millennia because it made lots of sense.
Yet the system was discredited because experimental evidence contradicted Aristotle's theories. Theories must be tested against the facts of
empirical reality to establish their validity.
"Is there any evidence that I employ the Living Algorithm to digest the data streams that define my existence? Hmmm? There are distinct
patterns of correspondence between the Living Algorithm's mathematical behavior and my human behavior regarding multiple sleep-related
phenomena. Further I have distinct requirements for a data stream mathematics that will fulfill my needs. Thus far, the Living Algorithm
System has fulfilled all of those requirements.
"Suppose that I do employ the Living Algorithm to digest numerical data. What about the abundance of information flows that cant be
assigned a distinct number? For instance, what about the relative terms that are so useful for organizing our world, such as lighter, bigger,
smaller, or smarter? How well does the Living Algorithm fare with non-numerical entities?
"But I cant think clearly anymore. My Pulse of Attention is fading fast. Have absorbed so much new information. Time for a Rest Pulse to
provide my Liminals an opportunity to integrate the information."
To see if the Living Algorithm's digestive process can deal with relative terms, read The Living Algorithm Algorithm.
To remain in our metaphorical world, read on.

CE9-10: Comfortable with Living Algorithm's Algorithm, Life wonders about Choice

We have come a long way. As we began this tome, Life was looking for a mathematical partner who would be sensitive to her unique and
subtle features - a Mathematics of the Moment. Probability's preoccupation with her general features prohibited him from
fulfilling this role. While predictable, dependable, and even comfortable, Probability's limited understanding was not sufficient to
deal with Life's spontaneity. In fact he was continually attempting to box her in with his certainty, even claiming that her
'wildness' was just an aberration. 'Statistically insignificant' was the phrase he regularly applied to her behavior, when it strayed
from the norm. Due to the dysfunctional nature of their relationship, many of her friends even speculated that Life would never
find a mathematics that she could be happy with. "Math is just too rigid, precise, and automatic. Life needs to be free. After all
she is an Artist. She can't be boxed in by math's rigid forms."
Life had almost given up, when along came the Living Algorithm. It was not exactly love at first sight between these two unlikely partners.
Small and unassuming, the Living Algorithm was nothing to look at. One would never even notice her in a crowd of equations. Her
operations are basic and her elements few. Certainly nothing noteworthy. In fact, Life originally mistook her for one of Probability's many
equations, peremptorily dismissing her from the running for mathematical partner. But when Life saw the Living Algorithm in
action, everything changed.
Although the Living Algorithm is not much to look at, her offspring are spectacular. She mates with Data Streams to produce an Info
System. Amazingly enough, the Living Algorithm's Info System specializes in characterizing the moment - a possibility that Life had
almost given up on. Life also relished the Living Algorithm's dragon-like flexibility of interpretation, as she had long since tired of
Probability's know-it-all rigidity. Further, the Info System includes the Living Algorithm's Family of Measures along with their myriad
manifestations, including the Creative Pulse and the Triple Pulse. As well as being gorgeous, these two are sensitive to context
and relationship. This was particularly attractive to Life, as she is also all about context and relationship. To be honest,
Probability is particularly inept at understanding these two aspects of her personality.
Life: After our many successful interactions, which culminated with the Biology of Sleep, I began wondering why we get along so well.
Alga suggested that she might be part of my operating system. To test this theory, Alga asked that I state what I needed from a mathematical
system. I posed some requirements and ordeals, all of which Alga easily passed. It turns out that the Living Algorithm specializes in
describing the relationship between moments - providing meaning to the data stream of instants that are continually bombarding
me.
I could see that Alga could deal with numbers, but I wondered how she would do with the relative values that characterize my existence -
a little more, a lot less, etcetera. It turns out that the Living Algorithm algorithm can easily handle relative terms that are
non-numerical. The computations are easy and the memory requirements are minimal. She has no need of an extensive database,
like the kind Probability requires. Instead she generates measures that are emotionally charged because they are
associated with future expectations. It turns out that I can more easily remember things that are emotionally tagged. It pays for
me to remember the values behind Alga's Predictive Cloud because they indicate the expected position, range of variation, and
recent tendencies of the data stream's next value. It seems that the Living Algorithm's algorithm handles both verbal and
numerical data equally well.
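(Again for the code-minded reader, here is a hypothetical sketch of what Life is describing: three running values - an expected position, a range of variation, and a recent tendency - updated from each new input, with verbal relative terms mapped onto small numbers before digestion. The class name, the decay constant D, and the particular word-to-number mapping are all illustrative assumptions rather than part of the Living Algorithm System itself.)

class PredictiveCloudSketch:
    """Illustrative running measures: expected position, spread, and tendency.

    Only three numbers are stored, so no extensive database is needed.
    D is an assumed decay constant; the document does not fix its value.
    """
    # Hypothetical mapping of relative (verbal) terms onto small numbers.
    RELATIVE = {"a lot less": -2, "a little less": -1, "same": 0,
                "a little more": 1, "a lot more": 2}

    def __init__(self, D=10.0):
        self.D = D
        self.position = 0.0   # expected position of the next value
        self.variation = 0.0  # typical range of variation around it
        self.tendency = 0.0   # recent directional tendency (drift)

    def digest(self, value):
        if isinstance(value, str):
            # A verbal term is read relative to the current expected position.
            value = self.position + self.RELATIVE[value]
        change = value - self.position
        self.tendency += (change - self.tendency) / self.D
        self.variation += (abs(change) - self.variation) / self.D
        self.position += change / self.D
        return self.position, self.variation, self.tendency

# Mixed numerical and verbal input digested by the same update.
cloud = PredictiveCloudSketch()
for v in [3, 4, "a little more", 5, "a lot less", 2]:
    print(cloud.digest(v))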
Our relationship is proceeding so smoothly, even seamlessly. Our complete compatibility borders upon the miraculous. My attraction to
the Living Algorithm and her System is increasing by leaps and bounds. Alga has fulfilled most of my requirements and passed my ordeals.
Could the Living Algorithm really be part of my operating system?
There was still one feature that had not yet been worked out. I, Life, was still curious about how the Living Algorithm System felt about
choice. This topic is especially dear to me, in fact a relationship breaker. You see, one of my other mathematical suitors, Physics, had
bluntly told me that choice was just an illusion. He even claimed that his equations were on the verge of completely predicting my every
move. "Just give me your initial conditions and I will tell you everything that is going to happen,' he bragged. Not wanting to ever be boxed
in so completely, I dumped him. But I never forgot his comment about choice. It bugged me. Plus Probability, while a mite more flexible,
had made a similar suggestion. In the midst of one of our many arguments about spontaneity, he even said to me: "You consist solely of
subatomic particles, nothing else. I can accurately predict the behavior of these sub-atomics. By straightforward logic, I can therefore
predict your behavior." While he couldn't predict my behavior currently, I wondered if he might be able to at some later time. His statement
placed more than a little doubt in my mind.
To see how this issue between Life and the Living Algorithm is resolved check out the next article in the series The Mathematics of
Informed Choice.
To remain in the metaphorical world, read Probability challenges Living Algorithm's scientific credentials.

CE10-Dyn1. Digestible Information & the Living Algorithm's birth


Life digests Living Algorithm's Digital Info, not Physics' Continuous Stream

In her quest to find a mathematical partner who could provide some guidance, Life found the Living Algorithm. They immediately bonded
due to their common obsession with data streams. However, Life didn't want just any old data stream mathematics. She wanted a data
stream mathematics that could incorporate immediacy, relationships and the potential for informed choice. After their most recent
encounter, Life was satisfied that the Living Algorithm had fulfilled all of her requirements, including choice.
In contrast, Physics denies the possibility of choice due to his absolute obsession with Life's material nature. Physics prides himself on his
continuous equations. These continuous equations have an extraordinary power for describing the behavior of matter. Due to Physics'
obsession with his continuous equations, it should come as no surprise that he would attempt to use them to characterize Life. Despite an
abundance of evidence to the contrary, Physics makes the logical inference that Life is subject to the same mindless automatic laws as
Matter. These automatic laws lead to an equally automatic future where everything is predetermined by the interactions of Matter.
Similarly, it should come as no surprise that Life, whose very survival is based in her ability to make informed choices, would instead
embrace the Living Algorithm as her mathematical partner.
Life: I don't blame Physics for being obsessed with his equations. His equations got him where he is today. In fact, this power was why I
was so attracted to him initially. But then when I got to know him, all he talked about was dirt and dust - atoms and molecules. He even
tried to apply his equations to me. I felt he was totally objectifying me, neglecting my special features.
I try not to judge Physics for his myopia, for I know where he has come from. For millennia, he and his type believed that even matter was
alive, or at least had life-like properties. Boy, were they confused. However, in trying to distance themselves from this animistic perspective,
the philosopher-scientists had an equal and opposite reaction. From everything being animate, they decided that nothing was animate.
The collective effort of Galileo, Newton, Einstein, et al. established conclusively that the entire material universe, the stars as well as the
Earth, obeyed universal laws of motion. Intoxicated with this success in the material world, Physics generalized his findings to my
biological world of living organisms. "I can predict the world of matter with nearly absolute accuracy. You, Life, are composed of matter,
and as such, with insight, I will also be able to accurately predict your behavior."
Offended, I responded bluntly: "Take your continuous equations and go to that cold, automatic matter that obeys your every command. I
need a different kind of mathematics with a different kind of equation." Obviously Physics had completely objectified me, neglecting all the
qualities that make me special.


Life: "I am different than Matter in so many ways. I'm amazed Physics can't see our differences. I have the capacity for self-reflection. This
ability enables me to consider the past when making decisions about the future. Matter responds automatically to environmental stimuli
and has no capacity for memory or self-reflection. Further, I continually regenerate myself. That is what it is to be alive. Matter goes
through transformations but doesn't regenerate itself. These are crucial differences, and I can't believe Physics doesn't see them. I guess it's
because he views everything through the filter of his continuous equations that precisely determine the behavior of matter. I am sick and
tired of him trying to force me into his automatic little boxes that he claims hold everything. Guess what, not me. I'm outta here."
For a long time Life thought she was doomed to mathematical isolation. But then along came the Living Algorithm. "Not only does the
Living Algorithm support regeneration, she is also in favor of self-reflection. As soon as we met, we became best friends. Instead of forcing
me into some conception of who she thinks I should be (like Physics, that jerk), she is sensitive to my special features. The continuous
equations of Physics, which are so effective in dealing with Matter's innate automatic nature, are helpless before my regenerative,
immediate nature. This is my best friend's realm.
How is it that the basic math of the Living Algorithm can characterize me so well? Physics, on the other hand, with all his complex
mathe-magical equations, is still unable to accomplish this. It has to do with the Living Algorithm's method, her tao. For one, she digests data
streams, just like I do. Physics does not. He only formulates laws. The Living Algorithm is like a Cuisinart, turning information into a more
easily digestible form. Physics just creates models of reality, like a toy train set. He then tries to force me inside. He is so pedantic, always
wanting to be right. In contrast, the Living Algorithm, while not having the bells and whistles, is streamlined and incredibly useful. She
doesn't attempt to just describe my data streams, she helps me to understand the meaning behind them."
"The equations of Physics create a continuous physical universe. Yet a continuous flow of information gives me indigestion. I, Life, only
digest data streams that come in discrete digitized bytes. It is ideal that both a photon of light and an electron come in discrete quantized
chunks, as this is the most digestible form for me."
"A digital flow of information enables me, whether Im in the form of an amoeba or a human, to respond to environmental stimuli in a
timely fashion. To survive, I require a give-and-take relationship with the world around me. Even if the information from my surroundings
comes continuously, I must break it into parts so that I can digest it and respond appropriately. The continuous alternation of stimulus and
response enables organisms, such as me, to maintain balance, temperature control, and so on. In contrast, the equations of Physics are
continuous, not digitized. Accordingly, there is no room to monitor and adjust to circumstances. These physical equations have no need for
a digital, give-and-take mechanism, as they were derived to deal with matter's automatic and eternal response to environmental stimuli.
Marvelously, the Living Algorithm specializes in digesting digital data streams. We have so much in common. Due to my innate nature as a
biological system that is constantly digesting information, the Living Algorithm realizes that I, Life, must digest her information in smaller
chunks to make it easier to assimilate. This is another reason we get along so well.
There are just a few more questions that have been bugging me. I think I'll give Alga a call and see if we can get together.

The Living Algorithm based upon the Natural Feedback Loop


After the normal pleasantries, greetings, hugs, and such were exchanged, Life: I am impressed by the patterns of correspondence between
your mathematical behavior and my human behavior. I am also pleased that you were able to fulfill all of my requirements for a
mathematics of living data streams. But how did you and your equation come into being?
Living Algorithm: My algorithm arises spontaneously whenever certain basic conditions occur. These basic conditions are innate to living
systems. They are related to the interactive feedback loop between the organic forms you embody and their environment. There is a slight
lag time between each observation and the potential for response. During this slight lag between environmental data points, the original
sensory input decays, as it is overlaid with the next burst of sensory input.
A camera that films the screen that it is projecting upon simulates this feedback loop. The intricacies of the patterns generated from simple
shadows are remarkable. The camera is the Living Algorithm, while the image on the screen is a picture of the most recent moment in this
visual data stream.
Note that the original input is not eliminated from the screen; it just fades. Further, the new input does not come in at 100% capacity, as
this would eliminate the useful information from previous input. In essence, the trajectories or characteristics of the most recent moment
equal the impact of the most recent instant plus the faded prior moment. This process is the essence of my algorithm. As I said, my process
arises naturally whenever there is an interactive feedback loop between a living system and its environment. Does that make sense?
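(The camera-and-screen analogy can be simulated directly. In the toy loop below, each pixel of the screen keeps a faded copy of its past plus a scaled dose of the newest frame. The fade and gain values are assumptions chosen for illustration; the dialogue only says that the old image fades and that the new input does not arrive at full strength.)

def feedback_screen(frames, fade=0.9, gain=0.1):
    """Toy simulation of a camera filming the screen it projects upon.

    Each pixel keeps a faded copy of its past plus a scaled version of the
    newest frame: screen = fade * screen + gain * frame.
    (fade and gain are illustrative values, not specified in the text.)
    """
    screen = [0.0] * len(frames[0])
    for frame in frames:
        screen = [fade * s + gain * f for s, f in zip(screen, frame)]
        print(["%.2f" % p for p in screen])   # the 'most recent moment'
    return screen

# Example: a bright spot appears in the middle of a 5-pixel strip, then vanishes.
feedback_screen([[0, 0, 1, 0, 0]] * 3 + [[0, 0, 0, 0, 0]] * 3)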
Life: I don't know about the details. But in an overall sense, you seem to be saying that your algorithm describes the inherent process
behind an interactive feedback loop. Further, all the living systems that I embody are constantly involved in an interactive relationship (a
feedback loop) with the environment in order to survive or fulfill our potentials. Does this analysis imply that your algorithm, the Living
Algorithm algorithm, was there concurrent with the very beginnings of my life forms? Is it equally possible that, as the cognitive abilities of
my life forms evolved, the increasing complexity of their neural networks eventually allowed the organism to take advantage of other
inherent features of your algorithm's method of digesting information?
Living Algorithm: Go girl! Great articulation of questions that I have never entertained. It is certainly plausible that the living systems that
you symbolize were able to tap into increasingly complex features of my digestion system as their neural networks evolved in complexity.
As for the beginnings of your life forms, I can't think of any manner in which living systems could engage in an interactive feedback loop
without my algorithm, or something very much like it.
Life: Just one last question. Why are there patterns of correspondence between your Triple Pulse and my behavior? In other words, what is
the causal mechanism that links you and me?
Alga: Ah, the dynamics of causality. This is a complex question that requires an extended answer. In fact, it takes an entire volume, Data
Stream Dynamics, to provide this answer. Check out the first article, Living Algorithm Algebra. It is an introduction to my static
side.
Life: Whew! I think it is time to take a break. Our Pulse is fading fast. My Liminals are demanding some down time to assimilate and
integrate this new material.
Alga: Good point. Thanks for your attention. See you next time.
