Introduction
There is a famous Theorem in modern mathematics, called
the Fixed Point Theorem, attributed to L. E. J. Brouwer
and later generalized by Kakutani.
To explain the Theorem in simple terms, consider a
mathematical function which maps each point in a given space
onto a corresponding point in the same space.
A popular version of such a map is an ordinary hand calculator.
Any given calculation (i.e. function) transforms the number
shown in the display into another number shown in the display.
For the calculator example, the space is just the set of possible
numbers that can appear in the display.
The Fixed Point Theorem states the conditions under which
a function possesses a point that maps into itself.
Fixed Points in Ordinary Algebraic Functions
It turns out that it is very common for a function, f(x),
to possess one or more fixed points, x*, such that f(x*) = x*.
Returning to the calculator example, if one repeats a sequence
of operations (i.e. a function) by plugging the output of the
function back into the input, one of three things will happen:
1) The numbers in the display will march off to infinity and
the calculator will overflow;
2) The numbers in the display will enter a cycle of length n,
where n is 2 or larger, but not larger than the number of internal
states of the calculator; or
3) The numbers will converge to a fixed point, x*.
Depending on the calculator design, the number of states might
be the same as the number of distinct values that can be displayed.
(Some calculators keep one or two extra digits after the decimal
point which are never displayed.)
It is not hard to discover functions which have fixed points
that can be found by this calculator method.
Such functions are called contraction mappings, and they are well
studied in the mathematical literature.
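As a rough illustration (my own sketch, not part of the original
text), the short Python program below feeds a function's output
back into its own input and reports which of the three outcomes
occurs; the test function cos(x), the rounding that imitates a
finite display, and the tolerances are all illustrative
assumptions.

    import math

    def iterate(f, x0, max_steps=500, tol=1e-9, digits=12):
        """Feed f's output back into its input, as on a calculator:
        stop on overflow, on a repeated display value (a cycle),
        or on apparent convergence to a fixed point x* = f(x*)."""
        seen = {}
        x = x0
        for step in range(max_steps):
            fx = f(x)
            if not math.isfinite(fx):
                return ("overflow", x, step)        # outcome 1
            if abs(fx - x) < tol:
                return ("fixed point", fx, step)    # outcome 3
            shown = round(fx, digits)               # the finite "display"
            if shown in seen:
                return ("cycle", fx, step - seen[shown])  # outcome 2
            seen[shown] = step
            x = fx
        return ("undecided", x, max_steps)

    # cos is a contraction near its fixed point x* = 0.739085...,
    # so pressing the cos key over and over converges:
    print(iterate(math.cos, 1.0))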
The roots (zeroes) of a function, f(x), are the values of x
for which f(x) = 0.
Sir Isaac Newton showed how to discover the roots of a function
by constructing a new function, F(x) = x - f(x)/f'(x).
(Here, f'(x) denotes the first derivative of f(x).)
If the original function, f(x), is "reasonably well-behaved," F(x)
will be a contraction mapping with fixed points at the roots of f(x).
Newton's Method is perhaps the most famous application of the Fixed
Point Theorem, and it is very easy to write an algorithm (computer
program) to implement it.
(As an exercise, it is instructive to find simple functions, f(x),
for which Newton's Method cycles without converging to a root,
or diverges to infinity.)
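Since it is indeed easy to program Newton's Method, here is one
possible Python sketch; the stopping tolerance, the iteration cap,
and the sample function f(x) = x^2 - 2 (whose positive root is the
square root of 2) are illustrative choices, not taken from the
text.

    def newton(f, fprime, x0, tol=1e-12, max_steps=50):
        """Iterate F(x) = x - f(x)/f'(x); a root of f is a
        fixed point of F."""
        x = x0
        for _ in range(max_steps):
            x_next = x - f(x) / fprime(x)
            if abs(x_next - x) < tol:   # iteration has stopped moving
                return x_next
            x = x_next
        raise RuntimeError("no convergence (cycling or diverging?)")

    # Root of f(x) = x^2 - 2, i.e. the square root of 2:
    print(newton(lambda x: x * x - 2.0, lambda x: 2.0 * x, x0=1.0))

    # One answer to the exercise above: Newton's Method applied to
    # f(x) = x**3 - 2*x + 2, started at x0 = 0, cycles between 0 and 1.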
Fixed Points in More General Maps
Once the idea of fixed points is understood, it is amusing to
apply the idea to "nonmathematical" maps.
On the surface of the earth, there is at all times at least
one point where the wind is calm.
Depending on how one combs one's hair, there may be a fixed
point (the center of the whorl) or even a fixed line (a part down the middle).
Consider the map called Chemistry.
Chemistry maps the set of elements and molecules onto itself.
There are evidently many cycles in chemistry, but
there are even some fixed points.
DNA is the most important molecule in this regard.
When the laws of Physics and Chemistry are applied
to DNA, they give back another DNA molecule.
That is, self-replicating molecules are an instantiation
of the Fixed Point Theorem where the map is the one
determined by the laws of Physics and Chemistry.
Stability of Fixed Points, Evolution, and the Fight Against Entropy
Since fixed points of contraction mappings
and their cousins, the cycles, persist (through continual re-creation),
they represent stable states.
Small perturbations, which arrive at random, may disturb the system.
Depending on the nature of the perturbation, the system
will either return to the original stable state, or it will
"derail" and wander off until a new fixed point is encountered.
The random perturbations come about because of the quantum
nature of the universe.
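As a toy illustration of this return-or-derail behavior (again my
own construction, not from the text), the sketch below perturbs an
iterate that has settled at a fixed point of a map with two
attracting fixed points; the particular map, the Newton map for
x^2 - 1 with attractors at +1 and -1, and the sizes of the random
kicks are arbitrary assumptions.

    import random

    def F(x):
        """Newton map for f(x) = x**2 - 1; its attracting fixed
        points are +1 and -1, each with its own basin."""
        return 0.5 * (x + 1.0 / x)

    def settle(x, tol=1e-9, max_steps=500):
        """Apply F repeatedly until the iterate stops moving."""
        for _ in range(max_steps):
            x_next = F(x)
            if abs(x_next - x) < tol:
                return x_next
            x = x_next
        return x

    random.seed(0)
    x_star = settle(2.0)              # settles at the fixed point +1
    for kick in (0.1, 0.5, 3.0):      # progressively larger kicks
        perturbed = x_star + random.uniform(-kick, kick)
        # small kicks return to +1; a large kick can cross into the
        # other basin, and the system "derails" to -1 instead
        print(f"kick up to {kick}: settles at {settle(perturbed):+.6f}")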
Sometimes (quite by accident, if you will)
the new fixed point is more stable than the old one;
it is then even harder to derail: it persists longer against the odds.
Evolution is evidently the process of moving to ever stabler
fixed points, working against the force of Entropy (the destroyer),
which leads back to decay and disorder.
The agents of evolution are random perturbations.
The process of random perturbations is extensively studied
in the literature.
Norbert Wiener is best known for his study of random walks.
To my mind, it is not a coincidence that Wiener was also
a pioneer in the field of cybernetics (the merger of psychology,
feedback control theory, and automatic computation).
Wiener's last book was entitled God & Golem, Inc.
("Golem" is a Yiddish word for an anthropomorphic robot.)