Logistic map


The logistic map is a polynomial mapping (equivalently, recurrence relation) of degree 2, often referred to as an archetypal example of how complex, chaotic behaviour can arise from very simple nonlinear dynamical equations. The map was used by Edward Lorenz in the 1960s to showcase irregular solutions, and was popularized in a 1976 paper by the biologist Robert May, in part as a discrete-time demographic model analogous to the logistic equation written down by Pierre François Verhulst. Mathematically, the logistic map is written

x_{n+1} = r x_n (1 − x_n),

where x_n is a number between zero and one that represents the ratio of the existing population to the maximum possible population. This nonlinear difference equation is intended to capture two effects: reproduction, where the population increases at a rate proportional to the current population when the population is small, and starvation (density-dependent mortality), where the growth rate falls as the population approaches the carrying capacity of the environment. The usual values of interest for the parameter r are those in the interval [0, 4], so that x_n remains bounded on [0, 1]. The r = 4 case of the logistic map is a nonlinear transformation of both the bit-shift map and the μ = 2 case of the tent map. If r > 4, the map eventually produces negative population sizes. (This problem does not appear in the older Ricker model, which also exhibits chaotic dynamics.) One can also consider values of r in the interval [−2, 0], so that x_n remains bounded on [−0.5, 1.5].
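A minimal Python sketch of the recurrence (the function names and the example parameters are illustrative, not part of any standard library):

```python
def logistic_step(r, x):
    """One iteration of the logistic map x_{n+1} = r * x_n * (1 - x_n)."""
    return r * x * (1 - x)

def iterate(r, x0, n):
    """Return the first n iterates of the logistic map starting from x0."""
    xs = [x0]
    for _ in range(n):
        xs.append(logistic_step(r, xs[-1]))
    return xs

# Example: r = 3.2 settles onto a 2-cycle, while r = 3.9 wanders chaotically.
print(iterate(3.2, 0.2, 10))
print(iterate(3.9, 0.2, 10))
```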

Characteristics of the map

Behavior dependent on r

The image below shows the amplitude and frequency content of some logistic map iterates for parameter values ranging from 2 to 4. By varying the parameter r, the following behavior is observed:

- With r between 0 and 1, the population will eventually die, independent of the initial population.
- With r between 1 and 2, the population will quickly approach the value (r − 1)/r, independent of the initial population.
- With r between 2 and 3, the population will also eventually approach (r − 1)/r, but will first fluctuate around that value for some time. The rate of convergence is linear, except for r = 3, when it is dramatically slow, less than linear (see Bifurcation memory).
- With r between 3 and 1 + √6 ≈ 3.44949, from almost all initial conditions the population approaches permanent oscillations between two values.
- With r between 3.44949 and 3.54409 (approximately), from almost all initial conditions the population approaches permanent oscillations among four values.
- With r increasing beyond 3.54409, from almost all initial conditions the population approaches oscillations among 8 values, then 16, 32, and so on. The lengths of the parameter intervals that yield oscillations of a given length decrease rapidly; the ratio between the lengths of two successive bifurcation intervals approaches the Feigenbaum constant δ ≈ 4.66920. This behavior is an example of a period-doubling cascade.
- r ≈ 3.56995 is the onset of chaos, at the end of the period-doubling cascade. From almost all initial conditions, we no longer see oscillations of finite period. Slight variations in the initial population yield dramatically different results over time, a prime characteristic of chaos.
- Most values of r beyond 3.56995 exhibit chaotic behaviour, but there are still certain isolated ranges of r that show non-chaotic behaviour; these are sometimes called islands of stability. For instance, beginning at 1 + √8 ≈ 3.82843 there is a range of parameters r that shows oscillation among three values, and for slightly higher values of r oscillation among 6 values, then 12, and so on. Within such a window, a stable cycle of period c is followed by stable cycles of period 2c, 4c, ..., 2^k c. This sequence of sub-ranges is called a cascade of harmonics. In a sub-range with a stable cycle of period 2^k c, there are unstable cycles of period 2^{k'} c for all k' < k. The r value at the end of the infinite sequence of sub-ranges is called the point of accumulation of the cascade of harmonics. As r rises there is a succession of new windows with different c values; the first one is for c = 3.
- Beyond r = 4, almost all initial values eventually leave the interval [0, 1] and diverge. The set of initial conditions which remain within [0, 1] forms a Cantor set, and the dynamics restricted to this Cantor set is chaotic.

For any value of r there is at most one stable cycle. If a stable cycle exists, it is globally stable, attracting almost all points. Some values of r with a stable cycle of some period have infinitely many unstable cycles of various periods.

The bifurcation diagram below summarizes this. The horizontal axis shows the possible values of the parameter r while the vertical axis shows the set of values of x visited asymptotically from almost all initial conditions by the iterates of the logistic equation with that r value. The bifurcation diagram is self-similar: if we zoom in on the above-mentioned value r ≈ 3.82843 and focus on one arm of the three, the situation nearby looks like a shrunk and slightly distorted version of the whole diagram. The same is true for all other non-chaotic points. This is an example of the deep and ubiquitous connection between chaos and fractals. We can also consider negative values of r in the interval [−2, 0], for which the map again exhibits fixed points, periodic orbits, and chaotic behaviour depending on r.
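These regimes can be checked numerically. The following is a minimal sketch (function and parameter names are ours, and the transient length is an arbitrary choice) that discards a transient and counts the distinct values an orbit keeps visiting:

```python
def attractor_sample(r, x0=0.2, transient=2000, keep=64, digits=6):
    """Iterate the logistic map, drop the transient, and return the sorted set of
    (rounded) values visited afterwards: 1 value = fixed point, 2 = 2-cycle, ..."""
    x = x0
    for _ in range(transient):
        x = r * x * (1 - x)
    seen = set()
    for _ in range(keep):
        x = r * x * (1 - x)
        seen.add(round(x, digits))
    return sorted(seen)

# Representative parameters: extinction, fixed point, 2-cycle, 4-cycle, chaos.
for r in (0.5, 1.5, 2.9, 3.2, 3.5, 3.9):
    vals = attractor_sample(r)
    print(f"r = {r}: {len(vals)} value(s) visited, e.g. {vals[:4]}")
```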

[Bifurcation diagram for the logistic map.

The attractor for any value of the parameter r is shown on the vertical line at that r.]

Chaos and the logistic map

[A cobweb diagram of the logistic map, showing chaotic behaviour for most values of r > 3.57.]

[The map f (blue) and its iterated versions f², f³, f⁴ and f⁵. For example, for any initial value on the horizontal axis, f⁴ gives the value of the iterate four iterations later.]

The relative simplicity of the logistic map makes it a widely used point of entry into a consideration of the concept of chaos. A rough description of chaos is that chaotic systems exhibit a great sensitivity to initial conditions—a property of the logistic map for most values of r between about 3.57 and 4 (as noted above). A common source of such sensitivity to initial conditions is that the map represents a repeated folding and stretching of the space on which it is defined. In the case of the logistic map, the quadratic difference equation describing it may be thought of as a stretching-and-folding operation on the interval (0, 1). The following figure illustrates the stretching and folding over a sequence of iterates of the map. Figure (a), left, shows a two-dimensional Poincaré plot of the logistic map's state space for r = 4, and clearly shows the quadratic curve of the difference equation. However, we can embed the same sequence in a three-dimensional state space in order to investigate the deeper structure of the map. Figure (b), right, demonstrates this, showing how initially nearby points begin to diverge, particularly in those regions of x_t corresponding to the steeper sections of the plot.

This stretching-and-folding does not just produce a gradual divergence of the sequences of iterates, but an exponential divergence (see Lyapunov exponents), evidenced also by the complexity and unpredictability of the chaotic logistic map. In fact, exponential divergence of sequences of iterates explains the connection between chaos and unpredictability: a small error in the supposed initial state of the system will tend to correspond to a large error later in its evolution. Hence, predictions about future states become progressively (indeed, exponentially) worse when there are even very small errors in our knowledge of the initial state. This quality of unpredictability and apparent randomness led the logistic map equation to be used as a pseudo-random number generator in early computers.

At r = 2, the function rx(1 − x) intersects y = x precisely at its maximum point, so convergence to the equilibrium point is quadratic (the error is roughly squared at each step) rather than merely geometric. Consequently, the equilibrium point is called "superstable", and its Lyapunov exponent is −∞. A similar argument shows that there is a superstable r value within each interval where the dynamical system has a stable cycle. This can be seen in the Lyapunov exponent plot as sharp dips.

Since the map is confined to an interval on the real number line, its dimension is less than or equal to unity. Numerical estimates yield a correlation dimension of 0.5 (Grassberger, 1983), a Hausdorff dimension of about 0.538 (Grassberger, 1981), and an information dimension of approximately 0.5170976 (Grassberger, 1983) for r ≈ 3.5699456 (onset of chaos). Note: it can be shown that the correlation dimension is certainly between 0.4926 and 0.5024.

It is often possible, however, to make precise and accurate statements about the likelihood of a future state in a chaotic system.
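A minimal sketch (function names are ours) that estimates the Lyapunov exponent by averaging ln|f′(x)| = ln|r(1 − 2x)| along a long orbit; positive values indicate the exponential divergence described above, while stable cycles give negative values (and superstable parameters give λ = −∞ in the limit):

```python
import math

def lyapunov_exponent(r, x0=0.2, transient=1000, n=100000):
    """Estimate the Lyapunov exponent of the logistic map at parameter r by
    averaging ln|f'(x_n)| = ln|r * (1 - 2*x_n)| along an orbit."""
    x = x0
    for _ in range(transient):
        x = r * x * (1 - x)
    total = 0.0
    for _ in range(n):
        x = r * x * (1 - x)
        total += math.log(abs(r * (1 - 2 * x)))
    return total / n

print(lyapunov_exponent(3.2))   # negative: stable 2-cycle
print(lyapunov_exponent(3.9))   # positive: chaos
print(lyapunov_exponent(4.0))   # close to ln 2, approx 0.693
```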
If a (possibly chaotic) dynamical system has an attractor, then there exists a probability measure that gives the long-run proportion of time spent by the system in the various regions of the attractor. In the case of the logistic map with parameter r = 4 and an initial state in (0, 1), the attractor is also the interval (0, 1) and the probability measure corresponds to the beta distribution with parameters a = 0.5 and b = 0.5. Specifically, the invariant measure has density

1 / (π √(x(1 − x))).

Unpredictability is not randomness, but in some circumstances looks very much like it. Hence, and fortunately, even if we know very little about the initial state of the logistic map (or some other chaotic system), we can still say something about the distribution of states arbitrarily far into the future, and use this knowledge to inform decisions based on the state of the system.
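A sketch comparing the empirical distribution of r = 4 iterates with the arcsine (Beta(1/2, 1/2)) density 1/(π√(x(1 − x))); the orbit length and binning are arbitrary choices:

```python
import math

def orbit_histogram(r=4.0, x0=0.2, n=200000, bins=10):
    """Histogram of logistic-map iterates, compared with the Beta(1/2, 1/2)
    invariant density 1/(pi*sqrt(x*(1-x))) predicted for r = 4."""
    counts = [0] * bins
    x = x0
    for _ in range(n):
        x = r * x * (1 - x)
        counts[min(int(x * bins), bins - 1)] += 1
    for i, c in enumerate(counts):
        lo, hi = i / bins, (i + 1) / bins
        empirical = c / n
        # Exact probability of the bin under the arcsine distribution, whose CDF is (2/pi)*asin(sqrt(x)).
        exact = (2 / math.pi) * (math.asin(math.sqrt(hi)) - math.asin(math.sqrt(lo)))
        print(f"[{lo:.1f}, {hi:.1f}): empirical {empirical:.4f}, arcsine {exact:.4f}")

orbit_histogram()
```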

Graphical representation

The bifurcation diagram for the logistic map can be generated with a few lines of Python.
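A minimal sketch, assuming NumPy and Matplotlib are available; the parameter range, resolution, and number of plotted iterates are arbitrary choices:

```python
import numpy as np
import matplotlib.pyplot as plt

r_values = np.linspace(2.5, 4.0, 2000)     # parameter range on the horizontal axis
x = 0.5 * np.ones_like(r_values)           # one trajectory per parameter value

# Discard the transient so only the attractor is plotted.
for _ in range(1000):
    x = r_values * x * (1 - x)

# Record the next iterates: these approximate the attractor for each r.
r_points, x_points = [], []
for _ in range(200):
    x = r_values * x * (1 - x)
    r_points.append(r_values.copy())
    x_points.append(x.copy())

plt.plot(np.concatenate(r_points), np.concatenate(x_points), ',k', alpha=0.25)
plt.xlabel('r')
plt.ylabel('x')
plt.title('Bifurcation diagram of the logistic map')
plt.show()
```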

Special cases of the map

Upper bound when 0 ≤ r ≤ 1

Although exact solutions to the recurrence relation are only available in a small number of cases, a closed-form upper bound on the logistic map is known when 0 ≤ r ≤ 1. There are two aspects of the behavior of the logistic map that should be captured by an upper bound in this regime: the asymptotic geometric decay with constant r, and the fast initial decay when x_0 is close to 1, driven by the (1 − x_n) term in the recurrence relation. A single closed-form bound is known that captures both of these effects.
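A small numerical illustration of the two effects just described (this is not the closed-form bound itself, and the function name is ours): with r = 0.5, a small x_0 decays geometrically with ratio roughly r, while an x_0 close to 1 collapses in a single step because of the (1 − x) factor.

```python
def logistic_orbit(r, x0, n):
    """First n iterates of the logistic map starting from x0."""
    xs = [x0]
    for _ in range(n):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

# Asymptotic geometric decay with ratio close to r when x is already small:
print([f"{v:.2e}" for v in logistic_orbit(0.5, 0.01, 6)])

# Fast initial collapse when x0 is close to 1, driven by the (1 - x) factor:
print([f"{v:.2e}" for v in logistic_orbit(0.5, 0.999, 6)])
```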

Solution when r = 4

The special case of r = 4 can in fact be solved exactly, as can the case with r = 2. The solution when r = 4 is

x_n = sin²(2^n θ π),

where the initial condition parameter θ is given by

θ = (1/π) arcsin(√x_0).

For rational θ, after a finite number of iterations x_n maps into a periodic sequence. But almost all θ are irrational, and, for irrational θ, x_n never repeats itself – it is non-periodic. This solution equation clearly demonstrates the two key features of chaos – stretching and folding: the factor 2^n shows the exponential growth of stretching, which results in sensitive dependence on initial conditions, while the squared sine function keeps x_n folded within the range [0, 1].

For r = 4 an equivalent solution in terms of complex numbers instead of trigonometric functions is

x_n = 1/2 − (α^{2^n} + α^{−2^n})/4,

where α is either of the complex numbers

α = 1 − 2x_0 ± 2i √(x_0(1 − x_0)),

each with modulus equal to 1. Just as the squared sine function in the trigonometric solution leads to neither shrinkage nor expansion of the set of points visited, in the latter solution this effect is accomplished by the unit modulus of α.

By contrast, the solution when r = 2 is

x_n = 1/2 − (1/2)(1 − 2x_0)^{2^n}

for x_0 ∈ [0, 1). Since (1 − 2x_0) ∈ (−1, 1) for any value of x_0 other than the unstable fixed point 0, the term (1 − 2x_0)^{2^n} goes to 0 as n goes to infinity, so x_n goes to the stable fixed point 1/2.
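A quick numerical check of the two closed forms (function names are ours): direct iteration at r = 4 agrees with sin²(2^n θ π), and direct iteration at r = 2 agrees with 1/2 − (1/2)(1 − 2x_0)^{2^n}.

```python
import math

def exact_r4(x0, n):
    """Closed form for r = 4: x_n = sin^2(2^n * theta * pi), theta = arcsin(sqrt(x0)) / pi."""
    theta = math.asin(math.sqrt(x0)) / math.pi
    return math.sin(2 ** n * theta * math.pi) ** 2

def exact_r2(x0, n):
    """Closed form for r = 2: x_n = 1/2 - (1/2) * (1 - 2*x0) ** (2^n)."""
    return 0.5 - 0.5 * (1 - 2 * x0) ** (2 ** n)

x0 = 0.3
x4, x2 = x0, x0
for n in range(1, 6):
    x4 = 4 * x4 * (1 - x4)       # direct iteration at r = 4
    x2 = 2 * x2 * (1 - x2)       # direct iteration at r = 2
    # The differences are tiny for small n (rounding error grows with n in the chaotic r = 4 case).
    print(n, abs(x4 - exact_r4(x0, n)), abs(x2 - exact_r2(x0, n)))
```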

Finding cycles of any length when r = 4

For the r = 4 case, from almost all initial conditions the iterate sequence is chaotic. Nevertheless, there exist an infinite number of initial conditions that lead to cycles, and indeed there exist cycles of length k for all integers k > 0. We can exploit the relationship of the logistic map to the dyadic transformation (also known as the bit-shift map) to find cycles of any length. If x follows the logistic map x_{n+1} = 4x_n(1 − x_n) and y follows the dyadic transformation

y_{n+1} = 2y_n mod 1,

then the two are related by

x_n = sin²(2π y_n).

The reason that the dyadic transformation is also called the bit-shift map is that when y is written in binary notation, the map moves the binary point one place to the right (and if the bit to the left of the binary point has become a "1", this "1" is changed to a "0"). A cycle of length 3, for example, occurs if an iterate has a 3-bit repeating sequence in its binary expansion (which is not also a one-bit repeating sequence): 001, 010, 100, 110, 101, or 011. The iterate 001001001... maps into 010010010..., which maps into 100100100..., which in turn maps into the original 001001001...; so this is a 3-cycle of the bit-shift map. And the other three binary-expansion repeating sequences give the 3-cycle 110110110... → 101101101... → 011011011... → 110110110.... Either of these 3-cycles can be converted to fraction form: for example, the first-given 3-cycle can be written as 1⁄7 → 2⁄7 → 4⁄7 → 1⁄7. Using the above translation from the bit-shift map to the r = 4 logistic map gives the corresponding logistic cycle 0.611260467... → 0.950484434... → 0.188255099... → 0.611260467.... We could similarly translate the other bit-shift 3-cycle into its corresponding logistic cycle. Likewise, cycles of any length k can be found in the bit-shift map and then translated into the corresponding logistic cycles.

However, since almost all numbers in [0, 1) are irrational, almost all initial conditions of the bit-shift map lead to the non-periodicity of chaos. This is one way to see that the logistic map is chaotic for almost all initial conditions. The number of cycles of (minimal) length k for the logistic map with r = 4 (equivalently, the tent map with μ = 2) is a known integer sequence: 2, 1, 2, 3, 6, 9, 18, 30, 56, 99, 186, 335, 630, 1161, .... This tells us that the logistic map with r = 4 has 2 fixed points, 1 cycle of length 2, 2 cycles of length 3, and so on. This sequence takes a particularly simple form for prime k: 2 · (2^{k−1} − 1)/k. For example, 2 · (2^{12} − 1)/13 = 630 is the number of cycles of length 13. Since this case of the logistic map is chaotic for almost all initial conditions, all of these finite-length cycles are unstable.
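A short sketch (names are ours) tracing the 1⁄7 → 2⁄7 → 4⁄7 bit-shift cycle and its image under x = sin²(2πy), which reproduces the logistic 3-cycle quoted above:

```python
import math

def dyadic(y):
    """The bit-shift (dyadic) map y -> 2y mod 1."""
    return (2 * y) % 1

def to_logistic(y):
    """Translation used in the text: x = sin^2(2*pi*y) sends dyadic orbits
    to orbits of the r = 4 logistic map."""
    return math.sin(2 * math.pi * y) ** 2

y = 1 / 7                     # binary expansion 0.001001001..., a 3-cycle of the bit shift
x = to_logistic(y)
for _ in range(4):
    print(f"y = {y:.6f}  ->  x = {x:.9f}")
    y = dyadic(y)
    x = 4 * x * (1 - x)       # the logistic iterate tracks to_logistic(y)
```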

Universality

Period-doubling route to chaos

In the logistic map, we have a function f_r(x) = r x (1 − x), and we want to study what happens when we iterate the map many times. The map might fall into a fixed point, a fixed cycle, or chaos. When the map falls into a stable cycle of length n, the graph of the n-fold composition f_r^n intersects the graph of x ↦ x at n points corresponding to the cycle, and the slope of f_r^n lies in (−1, +1) at those intersections. For example, when r is slightly below 3, there is a single such intersection, with slope in (−1, +1), indicating a stable fixed point. As r increases past 3, the intersection point splits into two, which is a period doubling. For example, when r = 3.4, ignoring the repelling fixed point at 0, the graph of f_r^2 meets the diagonal at three points: the middle one (the old fixed point) is unstable, and the other two, which form the 2-cycle, are stable. As r approaches 1 + √6 ≈ 3.45, another period doubling occurs in the same way. The period doublings occur more and more frequently, until at r ≈ 3.56995 the period doublings accumulate and the map becomes chaotic. This is the period-doubling route to chaos.
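A minimal sketch (names, transient length, and tolerance are ours) that detects the period of the attracting cycle for a few parameter values, making the successive doublings 1 → 2 → 4 → 8 → 16 visible:

```python
def attractor_period(r, x0=0.3, transient=100000, max_period=64, tol=1e-9):
    """Return the (approximate) period of the attracting cycle at parameter r,
    or None if no cycle of length <= max_period is detected (e.g. chaos)."""
    x = x0
    for _ in range(transient):
        x = r * x * (1 - x)
    reference = x
    for period in range(1, max_period + 1):
        x = r * x * (1 - x)
        if abs(x - reference) < tol:
            return period
    return None

for r in (2.8, 3.2, 3.5, 3.55, 3.566, 3.9):
    print(f"r = {r}: period {attractor_period(r)}")
```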

Scaling limit

Looking at the images, one can notice that at the onset of chaos r ≈ 3.56995, the curve of the iterated map looks like a fractal. Furthermore, as we repeat the period doublings, the graphs seem to resemble each other, except that they are shrunken towards the middle and rotated by 180 degrees. This suggests a scaling limit: if we repeatedly double the function and then rescale it by a certain constant α, then in the limit we end up with a function g that satisfies the functional equation g(x) = −α g(g(x/α)). This g is a Feigenbaum function, which appears in most period-doubling routes to chaos (thus it is an instance of universality). Further, as the period-doubling intervals become shorter and shorter, the ratio between the lengths of two successive period-doubling intervals converges to a limit, the first Feigenbaum constant δ ≈ 4.66920. The constant α can be found numerically by trying many possible values: for the wrong values the rescaled iterates do not converge to a limit, but for the right value they do. This α ≈ 2.5029 is the second Feigenbaum constant.
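Using commonly quoted approximate values of the first few period-doubling parameters r_k (treated here as given data rather than computed from scratch), the ratios of successive interval lengths already approach δ:

```python
# Approximate parameter values r_k at which the attracting cycle's period
# doubles from 2^(k-1) to 2^k (commonly quoted values, taken as assumptions here).
r = [3.0, 3.449490, 3.544090, 3.564407, 3.568759]

# delta_k = (r_k - r_{k-1}) / (r_{k+1} - r_k) approaches the first
# Feigenbaum constant delta ~ 4.6692 as k grows.
for k in range(1, len(r) - 1):
    delta_k = (r[k] - r[k - 1]) / (r[k + 1] - r[k])
    print(f"delta estimate from r_{k-1}, r_{k}, r_{k+1}: {delta_k:.4f}")
```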

Chaotic regime

In the chaotic regime, f_r^∞, the limit of the iterates of the map, becomes a collection of chaotic dark bands interspersed with non-chaotic bright bands.

Other scaling limits

When r approaches the accumulation point of the period-doubling cascade that begins in the period-3 window, we have another period-doubling approach to chaos, but this time with periods 3, 6, 12, .... This again has the same Feigenbaum constants δ and α, and the limit of the rescaled iterates is again the same Feigenbaum function. This is an example of universality. We can also consider a period-tripling route to chaos by picking a sequence of parameters r_n such that r_n is the lowest value in the period-3^n window of the bifurcation diagram. The sequence converges to a limit, but with a different pair of Feigenbaum constants, and the rescaled iterates converge to a different fixed-point function. As another example, period-4-pling has a pair of Feigenbaum constants distinct from that of period-doubling, even though period-4-pling is reached by two period-doublings. In detail, define r_n such that r_n is the lowest value in the period-4^n window of the bifurcation diagram; this sequence again converges to a limit, with its own pair of Feigenbaum constants. In general, each period-multiplying route to chaos has its own pair of Feigenbaum constants; in fact, there are typically more than one pair. For example, for period-7-pling, there are at least 9 different pairs of Feigenbaum constants. The constants for different period-multiplying cascades are approximately related, and the relation becomes exact as both periods increase to infinity.

Feigenbaum universality of 1-D maps

Feigenbaum universality extends to one-dimensional maps with a parabolic (quadratic) maximum, which share the same Feigenbaum constants δ ≈ 4.669201 and α ≈ 2.502908. For such a map with control parameter G, the gradual increase of G over the interval [0, ∞) changes the dynamics from regular to chaotic, with qualitatively the same bifurcation diagram as that of the logistic map.

Renormalization estimate

The Feigenbaum constants can be estimated by a renormalization argument (Section 10.7). By universality, we can use another family of functions that also undergoes repeated period-doubling on its route to chaos; even though it is not exactly the logistic map, it still yields the same Feigenbaum constants. One defines a one-parameter family with an equilibrium point at zero that, as r increases, undergoes a sequence of period-doubling bifurcations. The first bifurcation occurs at r = r_0 = 0. After this bifurcation, one can solve explicitly for the period-2 stable orbit. At some point r = r_1, the period-2 stable orbit undergoes period-doubling bifurcation again, yielding a period-4 stable orbit. In order to find out what that orbit is like, we "zoom in" around one of its points x = p using an affine change of variables. Routine algebra then shows that the zoomed-in, twice-iterated map has the same form as the original family with a renormalized parameter S(r); the second bifurcation therefore occurs approximately where S(r) = 0, the renormalized analogue of the first bifurcation at r_0 = 0. By self-similarity, the third bifurcation occurs where the twice-renormalized parameter reaches zero, and so on. Iterating this renormalization map for the parameter gives the spacing of successive bifurcations, and hence estimates of δ and of the spatial rescaling factor α. These estimates are within 10% of the true values.

Relation to logistic ordinary differential equation

The logistic map exhibits numerous characteristics of both periodic and chaotic solutions, whereas the logistic ordinary differential equation (ODE) exhibits regular solutions, commonly referred to as the S-shaped sigmoid function. The logistic map can be seen as the discrete counterpart of the logistic ODE, and their correspondence has been extensively discussed in the literature.
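As one illustration of this correspondence (a sketch assuming an explicit-Euler discretization, which is only one way of relating the two), discretizing dx/dt = ρx(1 − x) with step h gives x_{n+1} = x_n + hρx_n(1 − x_n); a linear change of variables turns this into the logistic map with r = 1 + hρ. Small steps follow the sigmoid, while large steps can oscillate or become chaotic:

```python
import math

def euler_logistic(rho, h, x0, n):
    """Explicit-Euler discretization of the logistic ODE dx/dt = rho * x * (1 - x)."""
    xs = [x0]
    for _ in range(n):
        x = xs[-1]
        xs.append(x + h * rho * x * (1 - x))
    return xs

def ode_exact(rho, x0, t):
    """Exact sigmoid solution of the logistic ODE."""
    return 1 / (1 + (1 / x0 - 1) * math.exp(-rho * t))

rho, x0 = 1.0, 0.1
for h in (0.1, 2.9):   # small step: sigmoid-like; large step (r = 1 + h*rho = 3.9): chaotic
    xs = euler_logistic(rho, h, x0, 20)
    print(f"h = {h}: last iterates {xs[-3:]}, exact x(t = {20 * h}) = {ode_exact(rho, x0, 20 * h):.4f}")
```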

Occurrences

In a toy model for discrete laser dynamics: x_{n+1} = G x_n (1 − tanh(x_n)), where x stands for the electric field amplitude and G, the laser gain, is the bifurcation parameter.

This article is derived from Wikipedia and licensed under CC BY-SA 4.0.

