This page contains summaries of the second complex of articles concerning Information Dynamics – theory, philosophy, justifications, applications, and the like.
The Living Algorithm is the core equation of regenerative mathematics upon which Information Dynamics is founded. As such, she is a defining feature of BD. However, understanding BD does not require knowledge of the Living Algorithm.
The Living Algorithm, the Mother of BD's equations, epitomizes elegant simplicity. She may be the simplest equation for decay and regeneration. Her operations don't exceed division, and her factors are few – just central measures and data. There is only one constant, and the Experimenter/Organism chooses it to set the appropriate rate of decay. The rest is automatic.
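Although this page does not reproduce her equation, a minimal sketch makes the simplicity concrete. The recurrence below is an assumption consistent with the description above – a single chosen constant D, nothing beyond subtraction and division – and the names are purely illustrative.

```python
# A minimal sketch of the Living Algorithm as described above. The exact
# recurrence is an assumption: one chosen constant D sets the rate of
# decay, and no operation beyond subtraction and division is needed.

def living_average(data_stream, D=10.0):
    """Digest a live data stream, one point at a time, into a Living Average."""
    average = 0.0
    for x in data_stream:
        # Each new point nudges the average toward itself by 1/D;
        # older points fade in influence with distance from the present.
        average += (x - average) / D
        yield average


# With D = 2, the stream 1, 1, 1, 0, 0 yields 0.5, 0.75, 0.875, 0.4375,
# 0.21875: regeneration toward sustained input, decay in its absence.
print(list(living_average([1, 1, 1, 0, 0], D=2.0)))
```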
The Living Algorithm digests live data streams, rather than dead data sets, just like Life. The Living Algorithm is self-referential and regenerative, just like Life. Her self-referential nature leads to the incredible complexity required of Life. Her measures decay in a spiral fashion, just like Life – with the most recent having the most impact and the rest fading with distance from the present. Further, the incoming Data is digitized, necessarily arriving in discrete chunks – just like Life, with her continual monitor/adjust relation to the environment. Due to these many similarities, the Living Algorithm and Life form an ideal partnership.
The Living Algorithm and the Cell have many links. Both digest information. Just as the cells essential for life convert environmental input into a usable form, the Living Algorithm converts numerical information into a usable form. Further, both have an intimate connection to living systems, and a potential connection to evolution. It is even conceivable that living systems have evolved to take advantage of the simple mathematical mechanisms of this system.
As a processor of a stream of data, the Living Algorithm manifests her Nature through time. As such, she is best studied through graphs, which reveal her dynamic nature. Indeed, our visual processing is so powerful that graphs are a preferred means of interpretation.
The Living Algorithm is the velocity of the data stream – the flow of numerical information. The first derivatives of the Living Algorithm, the Deviation and the Directional, are accelerations. The Deviation is an undirected scalar, while the Directional is a directed vector.
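To make the trio concrete, here is a hedged extension of the earlier sketch. The forms of the Deviation and the Directional below are assumptions modeled on the description above: each applies the Living Algorithm to the stream's change – the undirected (absolute) change for the Deviation, the signed change for the Directional.

```python
# A sketch of the trio under the same assumed recurrence as before.
# The Deviation and Directional forms are assumptions, not the article's
# own algebra: each is a decaying average of the stream's change.

def living_measures(data_stream, D=10.0):
    average = deviation = directional = 0.0
    for x in data_stream:
        change = x - average                        # new data vs. current context
        average += change / D                       # velocity: the Living Average
        deviation += (abs(change) - deviation) / D  # undirected (scalar) acceleration
        directional += (change - directional) / D   # directed (vector) acceleration
        yield average, deviation, directional
```

Read together, the three ongoing measures estimate a data stream's probable position (the average), range (the Deviation), and direction of motion (the Directional).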
The Living Algorithm was discovered in 1978. Sixteen years later, in 1994, the Deviation and the Directional were discovered. At the time, the Deviation was used in the Data Stream Momentum (DSM) Project, one branch of Information Dynamics. Eight years later, in 2002, the Directional was employed in the Creative Pulse Project, the other branch of BD.
The Directional is a self-referential, directed acceleration. Due to her nature, her manifestations are incredibly complex.
This article has primarily referred to the individual Living Algorithm, whose function is to generate the Living Average. However, the general Living Algorithm computes the Living Average, plus the extended families of the Directionals and Deviations. Therein lie her informational, biological, and evolutionary potentials. The Living Algorithm also collapses into the traditional mean average – another indication of her universality.
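The collapse into the traditional mean can be seen directly. If the single constant is allowed to grow with the count of data points (D = N), the recurrence assumed in the earlier sketches reproduces the ordinary arithmetic mean exactly – one reading, at least, of this collapse.

```python
# The collapse into the traditional mean average: with the decay constant
# set to the running count (D = N), the assumed recurrence becomes the
# textbook running arithmetic mean.

def running_mean(data_stream):
    average = 0.0
    for n, x in enumerate(data_stream, start=1):
        average += (x - average) / n   # the Living Algorithm with D = N
        yield average


data = [3.0, 5.0, 10.0]
assert list(running_mean(data))[-1] == sum(data) / len(data)
```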
Check out Living Algorithm's Evolutionary Potentials if you are interested in her possible link with senses and emotions. If you are interested in a contrast between the context-based Living Algorithm and traditional context-free equations, check out Living Algorithm vs. Traditional Equations. If you are interested in the manifestations of the Living Algorithm, check out her two projects, Data Stream Momentum and Creative Pulse, and their experiments, the Triple Pulse Experiment, the Creative Pulse Notebook, and the Boredom Principle.
This article examines some of the differences between the Living Algorithm and the traditional equations of Newtonian mechanics, the central measures of Statistics, and the entropic perspective of thermodynamics.
In the Living Algorithm we argued that Life and the Living Algorithm are both self-referential and regenerative, while the equations of Newtonian mechanics are neither. Due to her stimulus/response relation to the environment, Life digests data in a digital fashion, as does the Living Algorithm. In contrast, Newtonian equations are continuous, owing to matter's automatic response to environmental stimuli. Due to the give-and-take nature of her relationship with the environment, Life requires at least two interactive data streams. The speed of the feedback loop between these two data streams is an evolutionary factor – the quicker the loop, the better the chance of survival. 'Having an agenda' slows the feedback loop through a tendency to over-invest in results: the Person is no longer light enough to respond rapidly to changing external circumstances. This is true of the martial arts, of art, and of relationships.
Matter's hard sciences require hard data to develop their precise equations. These hard equations make exact predictions about the behavior of dead matter. In contrast, the Living Algorithm does not require hard data, as she processes an ongoing flow of information. She transforms raw data into a multiplicity of central measures (averages). This averaging process eliminates the inherent precision of the hard data. In exchange, the Living Algorithm's averages have powerful probabilistic predictive abilities that are ideal for Life. The soft nature of these predictions lends itself particularly well to the spontaneity of living systems. In fact, the hard predictions of matter science are so narrow that they restrict the possibility of a flexible response to ever-changing environmental conditions. The traditional central measures of Statistics, such as the mean average, also make probabilistic predictions. But their predictions concern only static data sets, not the dynamic data streams that contextual Life must deal with in a continuous, ongoing fashion. As such, the Living Algorithm's soft predictive powers are ideal for Life.
Another feature that the Living Algorithm shares with Life is that the arrow of Time points in only one direction, due to their decay/regenerative natures. In contrast, time is reversible for the precise equations of mechanics. Due to entropy, the arrow of time also points in only one direction for the probabilistic equations of thermodynamics. However, Life and the Living Algorithm have a regenerative component that is missing in any material system – including thermodynamics. This regenerative component leads to an innate organizational force that is lacking in the matter sciences. In contrast to the neutral space/time of inert matter, the regenerative component of the Living Algorithm/Life leads to a space/time that is charged with energy.
Check out the Living Algorithm if you are interested in her features. For her potentials, check out Evolutionary Potentials. For manifestations of the Living Algorithm, check out her two projects, Data Stream Momentum and Creative Pulse, and their experiments, the Triple Pulse Experiment, the Creative Pulse Notebook, and the Boredom Principle.
We've argued in previous articles that, because of the many similarities between the Living Algorithm and Life, the Living Algorithm is uniquely suited to model living systems. This article takes that line of reasoning a step further, maintaining that the Living Algorithm, or something very much like it, is Life's most basic information processor. In a future article we will argue that the actual forms (the grammar) of the Living Algorithm provided Life with the essential computational structures needed to develop the complexity required for the evo-emergence of humans.
This section argues that knowledge of the mathematical features of the ongoing flow of data provides an essential evolutionary advantage to any living system. All but the simplest living organisms need some kind of predictive power, both to maximize the impact of their responses to environmental stimuli and to minimize energy expenditure. The predator/prey relationship is an obvious example. A self-referential, regenerative, simply calculated function, such as the Living Algorithm, provides this predictive power. As a processor of a flow of information, the Living Algorithm generates a trio of ongoing measures that indicate the probable position, range, and direction of motion of any data stream. This invaluable information is crucial for predicting environmental stimuli and choosing the appropriate internal response.
In addition to predictive power, the Living Algorithm's trio of central measures provides the computational backdrop for all of the senses, including the crucial sense of acceleration. Besides conferring the ability to sense environmental forces, the sense of acceleration enables the organism to differentiate foreground from background. However, a prerequisite to sensing acceleration is the ability to quantify the accelerational component of the incoming message. The Living Algorithm provides this essential talent.
While quantification is a prerequisite for sensing acceleration, sustained Attention is a prerequisite for the ability to sense at all. Automatic calculations make simple evolutionary predictions that trigger an automatic response from plants and stomachs. Digested sensory input, however, only makes sense over time. For instance, differentiating random sensory noise from organized information requires a sense of time – the context. As such, a sensor is required to sense the contextual nature of the sensory signals. And if the accumulated sensory data is stored as memory, something must exist to retrieve this information. We will call this uber-sense Attention. Because of this co-dependent relation, we suggest that there was an essential co-evolution/emergence of the Senses and sustained Attention.
What is it that captures and focuses Attention? Acceleration. This feature differentiates an orderly signal from a random one. Let us suppose that the Living Algorithm's Directional is the measure of acceleration that Attention is drawn to. A graph of this acceleration shows that its magnitude (its intensity) manifests as a natural Pulse that rises and fades over time. The Triple Pulse Experiment reveals that a shift of Attention to a new modality is required to revive fading Attention. This solution is not physical, biological, or psychological. Instead, it is simply a mathematical by-product of a potential method by which living systems process information.
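Under the same assumed recurrences as the earlier sketches, this Pulse is easy to reproduce: feed the Directional an unbroken string of 1s, and her magnitude rises to a peak and then fades of its own accord, even though the input never changes.

```python
# A sketch of the Pulse: sustained, orderly input (a string of 1s) drives
# the Directional up to a peak, after which it decays back toward zero.
# The recurrence is the same assumption used in the earlier sketches.

def directional_stream(data_stream, D=10.0):
    average = directional = 0.0
    for x in data_stream:
        change = x - average
        average += change / D
        directional += (change - directional) / D
        yield directional


pulse = list(directional_stream([1.0] * 120))
peak = max(pulse)
assert 0 < pulse.index(peak) < len(pulse) - 1   # the peak lies mid-stream
assert pulse[-1] < peak / 2                     # and the pulse fades on its own
```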
Notice the parallels between the digestion of information and the digestion of food. Digestion time is required to process both information and food. Could it be that it is just as important to feed our biological Information Processor with information as it is to provide the stomach with food? This line of reasoning suggests that information processing and food digestion co-evolved, with information leading.
If you are interested in more evolutionary potentials of the Living Algorithm, check out Drives & Emotions. The Triple Pulse Experiment and the Creative Pulse Notebook explore some of the inherent forms of acceleration that are foundational for the digestion argument.
In the article, Newtonian Constructs Exploded, we claimed that Information Dynamics expands the notion of Newtonian constructs to effectively bridge matter and life. This article attempts to support this claim.
By rewriting the Living Algorithm in another form, we illustrate that the Living Average is the contextual velocity of a data stream. By rewriting the equation for the Directional in another form, we illustrate that she is the contextual acceleration of a data stream. In the discussion we note that velocity, like speed, is a measure of how fast something changes location (a rate of change). Acceleration, in contrast, is the change of velocity (the rate of change of a rate of change).
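As a worked illustration, here is one possible reconstruction of that rewriting, built from the recurrence assumed in the earlier sketches (with the initial average set to zero); the article's own algebra may differ.

```latex
% The Living Algorithm, unrolled into a decaying weighted sum of the
% stream's history: information digested per time step, i.e. a
% contextual velocity.
\bar{X}_N = \bar{X}_{N-1} + \frac{X_N - \bar{X}_{N-1}}{D}
          = \frac{1}{D} \sum_{k=1}^{N} \Bigl(1 - \frac{1}{D}\Bigr)^{N-k} X_k

% The Directional applies the same operation to the change in that
% velocity: a rate of change of a rate of change, i.e. a contextual
% acceleration.
\vec{\Delta}_N = \vec{\Delta}_{N-1}
             + \frac{\bigl(X_N - \bar{X}_{N-1}\bigr) - \vec{\Delta}_{N-1}}{D}
```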
What is the difference between traditional velocities and contextual velocities of data streams?
Traditional velocities and accelerations are rates of change – change of location over time – how far objects travel compared with how long the travel takes. The velocities and accelerations of a data stream are measures of how much information is transmitted over the time between successive data points.
Do velocity and acceleration have any relevance to BD Theory?
Major. First, identifying acceleration is the prime method for differentiating random from ordered – noise from meaning. The Directional (acceleration) of a random data stream is insignificant, while the Directional of an ordered data stream is significant. Second, according to BD Theory, Attention (informational mass) is attracted to acceleration, which generates a force. If attention is sustained over time (informational space), power is generated. And this power is employed to do the work of organizing the mind (memory traces). These pulses (based upon the combination of attention and informational acceleration) are the root of intrinsic motivation (the third drive documented in Drive by Pink) and of optimal experiences (documented in Flow by Csikszentmihalyi).
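For reference, here are Newton's composite constructs in their standard forms, annotated with the informational readings asserted above (mass as Attention, acceleration as the Directional); the mapping itself is BD Theory's claim, not standard physics.

```latex
% Newton's composites (standard definitions):
p = m\,v \qquad F = m\,a \qquad W = F \cdot d \qquad P = \frac{W}{t}
% Informational reading asserted above: mass -> Attention, acceleration
% -> the Directional. A force (Attention x acceleration) sustained over
% informational space/time yields the power and work that organize the
% mind (memory traces).
```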
You’ve established that the velocity and acceleration of data streams are very important to BD Theory, but what are these ‘exploded Newtonian constructs’ that ‘bridge matter and life’?
Humans process information to survive. One way of organizing this processed information creates the 3D world of matter, space, and time in which we live. There are other ways that we organize information. As such, information processing is the bridge that joins matter and life.
We claim elsewhere that the Living Algorithm is a primary method by which living systems process data. In the internal information world of the Living Algorithm, the fundamental concepts of space, time, and mass are subjective and variable, while in the external material world of Newtonian physics these same elements are objective and fixed. These are the ‘exploded Newtonian constructs’.
You claim that space, matter, and time are objective and fixed in matter's external world, while these three basic concepts are subjective and variable in information's internal world. Your bridge seems flimsy – with little or no efficacy. If these crucial constructs have such disparities, why compare them at all?
Newton. If a system consists of mass, space, and time, whether exploded or traditional, then it can employ Newton's composite relational constructs of momentum, force, power, and work. And Newton's momentum, force, power, and work are at the heart of BD's Theory of Information, just as they are at the heart of Newton's Theory of Matter. Life is first and foremost an information processor. This is how 'exploded Newtonian constructs bridge the worlds of matter and life.'
Following is a list of related articles. Living Algorithm's Evolutionary Potentials illustrates how this elementary processor generates a trio of powerful predictors with a possible link to the evolution of the senses. Living Algorithm explores the intimate connection between this simple function and living systems. Living Algorithm vs. Traditional Equations illustrates more life-based features of the Living Algorithm by comparing her with traditional equations. Of course, the Living Algorithm's two projects, Data Stream Momentum and Creative Pulse, and their experiments, the Triple Pulse Experiment, the Creative Pulse Notebook, and the Boredom Principle, illustrate how her forms influence reality.