Friday, September 14, 2012

Bread, chaos and Lyapunov exponents


"All sorrows are less with bread", wrote the author of Don Quixote, Miguel de Cervantes. Who can argue with him? It is true: whether added to a meal or taken as a meal by itself, bread is a wonderful food. The way it is prepared, especially the kneading of the dough, matters a great deal. I will show what happens in the dough that makes bread 'chaotically' good.


Start folding the dough
A good bread needs yeast. Unleavened bread is not my kind of thing, although I can understand that if you are walking through the desert for forty days, yeast is hard to come by, or, as the story goes, you are in such a hurry that you don't want to wait for the dough to rise. A nice thing about yeast is that it produces more carbon dioxide when heated. The yeast cells go "into overdrive" and try to get as much out of life as possible before their demise, eating as many of the carbohydrates as they are able and expelling carbon dioxide in the process. The carbon dioxide makes the bread fluffy. To avoid ending up with one big bubble of gas inside your bread, the yeast has to be mixed in thoroughly. And to think that mixing sounds like a trivial matter!

I would like to explain what mixing means in a mathematical sense. One disclaimer though: I am not a mathematician but a physicist, so I make no claims of rigour on the details; I will sacrifice rigour for understanding. Some of the concepts are also used in physics, which is not surprising, since physicists spend their time trying to work out formally how nature works, and that means calculating things. We will encounter a notion that is widely used but sparsely understood, adapted from a mythical primordial concept to our modern needs: chaos.

Around the turn of the twentieth century physics saw a shift in paradigms: the understanding of the very small with quantum mechanics, the understanding of the very large and fast with relativity, and a third, less visible shift, the understanding of complexity with chaos theory.

Now, a complex system is not synonymous with a complicated system. Some complex systems may consist of only a few parts. It is the way the parts interact that gives rise to emergent phenomena like chaos. Strictly speaking, it is the non-linear interaction or feedback in a complex system that causes chaotic behaviour. Non-linear simply means that the output of a system is not proportional to the input. Given one or more feedback loops, however, the output may display quite complicated behaviour. But what actually is chaotic behaviour? It has something to do with unpredictability, but what does that mean?

To answer that question we must first take a detour into the physics of the second half of the nineteenth century, when gases were quite the fashion. Steam engines were being developed and people had already done a lot of experimentation with compression and expansion; fizzy drinks had been around for almost a hundred years. Some physicists realised that a gas could be described as little molecules bouncing about, and the kinetic theory was born. Describing gases as a huge number of little bouncing balls calls for mathematics and physics that can handle large amounts of numbers: statistical mechanics. Although the trajectories of all the individual particles cannot be known, the macroscopic properties can be found. But that was not all.

In some ways there is more to be understood about these systems. Statistical mechanics introduced the notion of 'ergodicity'. A physical system has variables which describe the state of that system. The collection of all possible values of these variables is called the phase space of the system. One state is characterised by a single point in this space. A (stochastic) dynamical system changes its state, and this transformation can be seen as a trajectory through phase space. The ergodic hypothesis states that when the time average along a single trajectory and the average over all points of phase space are the same, the transformation is ergodic. In effect it means that, when one waits long enough, every region of phase space will be visited. Like a gas, a chaotic system seems to do just that.
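A small numerical sketch (my own illustration) makes this concrete. I use the fully chaotic logistic map $x \mapsto 4x(1-x)$ as a stand-in, because its invariant measure is known analytically and has mean 1/2; for an ergodic map, the time average of $x$ along one long orbit should match that space average.

```python
# Sketch: time average vs. space average for an ergodic map.
# The fully chaotic logistic map x -> 4x(1-x) is used as a stand-in
# (the doubling map degenerates in floating point). Its invariant
# density is 1/(pi*sqrt(x(1-x))), whose mean over [0,1] is 1/2.

def time_average(x0, n=1_000_000):
    """Average of x along a single trajectory of the logistic map."""
    x, total = x0, 0.0
    for _ in range(n):
        x = 4.0 * x * (1.0 - x)
        total += x
    return total / n

space_average = 0.5  # mean of the invariant measure, known analytically

print(time_average(0.2))  # close to 0.5 for almost every starting point
```

For almost every starting point the two averages agree, which is exactly what ergodicity promises.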
After each iteration the points in the phase space smear out, demonstrating the ergodic nature of the system.
The trajectory through phase space maps out a figure with interesting properties. After some initial movements the trajectory ends up in a so-called attractor. It can be a single point, a loop, or multiple loops. One has to realise that a single point in phase space signifies the complete state of the system. When one knows the dynamical equations there can be only one next point; there are no multiple choices. Therefore it is not possible for the trajectory to cross itself or another trajectory. A crossing would mean there is a "crossroad", an indeterminacy, while in a classical system there can be only one road to follow. However, it is possible for trajectories to get arbitrarily close, and since they can be infinitesimally close there is no reason why an attractor could not be bounded in phase space while never crossing itself. This leads to what is known as a strange attractor.

A real-world "toy model" of chaotic behaviour is the kneading of bread. A simplified description of the kneading process is the aptly named Baker's transformation, together with the map that follows from applying it many times over.

The Baker's transformation is defined as follows: 

$$B(x,y) = \begin{cases} (2x,\; y/2) & x < r \\ (2x-1,\; (y+1)/2) & x \ge r \end{cases}$$


This signifies a stretching in one direction and a folding in the other, like the kneading of dough. Usually r is taken to be 0.5, folding in half.
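The transformation is easy to put into code. A minimal Python sketch (function names are my own) for the usual case r = 0.5:

```python
def baker(x, y, r=0.5):
    """One stretch-and-fold of the unit square: stretch horizontally
    by a factor 2, cut at x = r, and stack the right half on top."""
    if x < r:
        return 2.0 * x, y / 2.0
    return 2.0 * x - 1.0, (y + 1.0) / 2.0

# A speck of yeast at (0.3, 0.4) after one fold:
print(baker(0.3, 0.4))   # (0.6, 0.2): stretched in x, squashed in y
```

Note that the x coordinate on its own follows the doubling map $x \mapsto 2x \bmod 1$; the y coordinate merely records the folding history.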


For certain values of the folding parameter, the values of x end up all over the place, giving rise to chaos.
The Baker's transformation is a chaotic system. That is to say, there exist trajectories that have a strange attractor. This can be shown in a bifurcation diagram, where the parameter of the transformation is plotted against the x-position after (infinitely) many iterations. One gets a rather fetching diagram. The diagram is made for the x coordinate only, as a function of the folding parameter r. For some values the x coordinate oscillates between certain values, which appear as the lines in the diagram.

To see what this means in our case, we take a point at x=0.7 and fold at r, then stretch and fold again, ad infinitum. The point x will come to lie at different positions. Sometimes it comes back to itself, in which case the whole thing repeats exactly in a limit cycle. But sometimes it doesn't. One can see that there are regions in which there are no more limit cycles and the structure becomes very disorganised, detailed and fractal in nature. This is the sign of a strange attractor and usually of chaotic behaviour.
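You can follow the fate of x = 0.7 under repeated folding exactly by iterating just the x coordinate (at r = 0.5 this is the doubling map) in rational arithmetic; a sketch:

```python
from fractions import Fraction

def doubling(x):
    """x-coordinate of the baker's map at r = 0.5: x -> 2x mod 1."""
    return (2 * x) % 1

# Follow x = 7/10 exactly and detect when the orbit repeats itself.
x = Fraction(7, 10)
seen = {}
step = 0
while x not in seen:
    seen[x] = step
    x = doubling(x)
    step += 1
period = step - seen[x]
print(period)  # the orbit of 0.7 falls onto a limit cycle of period 4
```

Floating point would not do here: doubling shifts binary digits out, and after roughly fifty iterations every float orbit collapses to zero. The exact fractions show the true limit cycle 0.4, 0.8, 0.6, 0.2, 0.4, ...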


There is an important theorem which holds for dynamical systems: Liouville's theorem states that the volume of a set of points in phase space stays the same while a conservative system evolves. For dissipative systems, which lose energy, the volume shrinks over time.

The fact that phase space trajectories cannot cross, combined with Liouville's theorem, can lead to interesting results. Because the volume occupied by a bundle of trajectories stays the same while the system keeps evolving to new, unvisited states, the "room" for navigation in a bounded part of phase space becomes smaller. The only solution for the trajectories is to get closer and closer. In doing so they trace out self-similar figures that are called fractals. In a certain sense the trajectories get so close together that they have to squeeze "into another dimension", creating an object (attractor) with a fractal dimension.
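A dissipative cousin of the baker's transformation shows such a fractal attractor directly: squeeze the vertical direction by 1/3 instead of 1/2 and the attractor becomes the unit interval crossed with a Cantor set, whose vertical dimension is $\ln 2/\ln 3 \approx 0.63$. A box-counting sketch (my own construction; the fold is driven by random bits, which mimics the coin-flip symbolic dynamics of the chaotic x coordinate and sidesteps floating-point degeneracies):

```python
import math
import random

def cantor_step(y, bit):
    """Vertical part of a dissipative baker's map: squeeze by 1/3 and
    stack in the bottom or the top third, depending on the fold."""
    return y / 3.0 if bit == 0 else (y + 2.0) / 3.0

random.seed(0)
y = 0.5
ys = []
for i in range(200_000):
    y = cantor_step(y, random.getrandbits(1))
    if i >= 100:                       # discard the initial transient
        ys.append(y)

# Box counting at scale 3^-7: how many intervals are occupied?
k = 7
boxes = {min(int(y * 3**k), 3**k - 1) for y in ys}
dim = math.log(len(boxes)) / math.log(3**k)
print(dim)  # close to ln 2 / ln 3, about 0.63: a fractal dimension
```

The dough analogue: the attractor does not fill the square, yet it is more than a line; it squeezes "into another dimension".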

Take two trajectories close to each other. When the distance between them grows exponentially in time, a trajectory is said to be chaotic. Alexandr Lyapunov (Алекса́ндр Ляпуно́в) was a Russian mathematician who lived in the era of the latest revolutions in science, of which he himself was one of the architects. He developed a coefficient to determine whether two orbits indeed fly apart exponentially, stay close forever, or even come to a halt at a point. He took two points close to each other and looked at what happened after an infinite amount of time. The Lyapunov exponent for a certain point is defined as:

$\lambda(x_0) = \lim_{n\to\infty} \frac{1}{n} \sum_{i=0}^{n-1} \ln |M'(x_i)|$.

This says that if we apply a transformation M a number of n times to a point x, where n ideally goes to infinity, we get a measure, $\lambda$, of how two initially close points end up. M can also be seen as generating a trajectory (transformation after transformation), and the derivative M' tells us how fast neighbouring trajectories separate along it. If they tend to diverge exponentially, we have chaos.

The regimes for $\lambda$ are: positive in the case of chaotic behaviour; negative for stability, where everything ends up on the same stable point or limit-cycle attractor; and zero at the margin between the two, for instance right at a bifurcation.
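For the baker's transformation itself the formula is almost trivial: $|M'(x)| = 2$ everywhere, so $\lambda = \ln 2 > 0$ and the map is chaotic. To see all three regimes at work, the logistic map $x \mapsto a\,x(1-x)$ is a handy stand-in (my example, not specific to bread): its derivative $a(1-2x)$ varies along the orbit, and $\lambda$ changes sign as the parameter $a$ is varied.

```python
import math

def lyapunov_logistic(a, x0=0.123, n=100_000, burn=1_000):
    """Estimate lambda = (1/n) * sum ln|M'(x_i)| for M(x) = a*x*(1-x)."""
    x = x0
    for _ in range(burn):          # let the orbit settle on its attractor
        x = a * x * (1.0 - x)
    total = 0.0
    for _ in range(n):
        total += math.log(abs(a * (1.0 - 2.0 * x)))
        x = a * x * (1.0 - x)
    return total / n

print(lyapunov_logistic(4.0))   # about +0.693 (= ln 2): chaos
print(lyapunov_logistic(3.2))   # negative: stable period-2 limit cycle
print(lyapunov_logistic(2.5))   # negative: stable fixed point
```

The sign of the estimate sorts the dynamics into the regimes above without ever plotting a single trajectory.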




Folding bread dough in this way incorporates the yeast evenly. Then, as soon as you bake it, the yeast starts to produce carbon dioxide in many places at once, giving rise to a complex and rich internal structure. A video shows how the yeast cells go into overdrive when heated and start producing a lot of CO2. We are committing yeasticide for a good cause: the fluffiness of the final product. If the yeast has been incorporated in a fractal fashion, the internal structure of the bread will look self-similar. This gives bread its airy yet strong texture.
In the making...
