Systems Science 8 — Complexity, Organization, and Order
We've looked at the boundary of a system as a way in which one system is delineated or differentiated from the environment. The boundary 'contains' the elements or components of a system. The word 'system' itself implies some kind of internal organization or structure along with function. In this installment we will take a look at the internal milieu of a system and make some general claims about organization that seem to hold for many different kinds of systems, of all scales and compositions.
What we are trying to get a handle on is the concept of complexity as it applies to the internals of systems and, by our recursive definition of systems, to the complexity of relations between multiple systems. We have already examined some aspects of how systems evolve from a less organized state to one of greater organization. We've also hinted at how subsystems organize and give rise to wholly new functions and interactions with other subsystems. The latter is what we have called emergence, in that new structures and functions emerge from lower-level subsystems interacting in new ways that become stable, consistent with the energy flows through the overall system.
'Complexity' is a tricky word. Even among scientists who study complexity in its many guises there is not always agreement about what it means or implies. So I want to be careful to delineate how I am using the term and why. Specifically, I do not want the word to carry too much burden of meaning. Some people look at highly organized systems with many interacting parts and call that complex. They are not wrong, but the word then implies not only numerous parts of numerous kinds, but also highly specific and possibly stable interrelationships between the parts. The problem arises when one looks at a collection of many component parts of many types and sees no pattern of relationships that remains stable over time. Is this a lack of organization in the system or a lack of perception and understanding on the part of the observer? In Installment 9, Classes of Systems, I will explain this more thoroughly.
A similar problem exists with the term 'order'. This implies a stable, consistent, perhaps even rigid structuring of the component parts. Complexity, in the sense of organized structure, is sometimes equated with order, but this is not quite right. Order, at least in the sense of the Third Law of Thermodynamics, is the kind of order found in a crystallized system. There are a few well defined structural relations that have become literally frozen in time and space. This can be achieved by lowering the system's temperature to a point where the internal thermal motion is minimized and the components settle into their minimal energy configuration with respect to one another. Or at least they are trapped in a local minimum, if not the global configuration that minimizes total entropy. The point, however, is that any complex system might be frozen into an ordered state, but this wouldn't tell us much about how interesting the system is.
And I think the issue is one of how interesting a system might be. Crystals might be pretty, like a diamond, but they are not terribly interesting as objects go. At the other extreme, a gas-phase mixture of many kinds of atoms is similarly not very interesting if held at a temperature that prevents interesting chemistry from taking hold. In the first case, that of the crystal, entropy is at a minimum but nothing is happening. In the second case entropy is at a maximum with too much happening, but none of it interesting. What we find interesting is the phase somewhere in between, where things are happening, and more than just atoms colliding with one another. What we are really interested in is dynamic organization, that is, a regime in which structures can emerge and interact in new ways.
Complexity Defined
Let's start with a straightforward definition of complexity. Then we will relate it to the ideas of energy flows and emerging organizations. In the end, we will see that this is precisely the basis for what we call universal evolution.
I will start by describing a multi-dimensional space in which each dimension measures an attribute of complexity. The first dimension is simply the number of kinds of atomic components. The word 'atomic' here means components that cannot (or need not) be broken down to simpler forms, as in the original Greek usage — atoma. The number of kinds can easily be finite, as in the number of kinds of naturally occurring atoms in the universe. In fact, I sometimes think that an infinite number of kinds would be a meaningless concept. Of course the number of kinds might be extensible just as we have extended the number of kinds of atoms in the periodic table by nuclear chemistry. But this isn't essential to the basic definition.
A second dimension would be the numbers of atoms of each kind — their relative abundance in the system. In the case of the universe, for example, atoms of hydrogen and helium are by far the most abundant relative to carbon, nitrogen, oxygen, etc. This dimension is important, but by no means crucial to the potential that complexity offers. We know that the existence of carbon, for example, counts for a great deal when it comes to organization.
A third dimension measures the number of pairwise interactions that are possible between components. This dimension is somewhat more problematic definitionally. In regular chemistry, for instance, interactions are never purely pairwise (e.g. reactions that take place in solution). The best that we can say for this dimension is that it measures the number of rules by which two atomic components might interact in as pure a context as possible. One might, for example, define a set of 'standard' conditions and then enumerate the ways in which any two components might interact. In chemistry and subatomic particle physics there are definite rules for binding components together (e.g. the number of electrons in the outer shell of an atom determines the kinds of chemical bonds it can form). The need for standard conditions introduces one or more additional dimensions that measure the various aspects of the environment which 'condition' the interactions. Here we see that complexity itself becomes complex to define!
This four-plus-dimensional approach to measuring complexity is very general. It should be applicable to any level in the ontological hierarchy (quantum physics, chemistry, biology, etc.). The measure of complexity is thus not a single number but an n-tuple (i.e. a list of numbers and other m-tuples separated by commas, where the length of the list is the number of dimensions and the numbers are appropriately scaled). The actual value of n will depend on how many environmental conditions are involved in describing the interactions. A list containing any tuple with low values is likely to describe a relatively non-complex system. For example, the four-tuple (2, (10^100,000, 10^20,000), 1, 5) would represent a rather impoverished system with only two component types, one interaction between them, and five conditions, even though it has a huge number of components of the two types (the 2-tuple as the second item in the four-tuple). [Note: I cheated a bit here. The last number should actually be a 5-tuple itself, with each number identifying the range of a condition parameter, but I thought that might be overkill for this example!]
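To make the idea concrete, here is a minimal sketch in Python of how such a measure might be written down as a data structure. The class and field names are my own, not terminology from this series; the values mirror the impoverished example above.

```python
# A minimal sketch (my own naming, not terminology from this series) of the
# four-plus-dimensional complexity measure as a data structure.

from dataclasses import dataclass
from typing import Tuple


@dataclass
class ComplexityMeasure:
    num_kinds: int                 # dimension 1: number of kinds of atomic components
    abundances: Tuple[int, ...]    # dimension 2: number of atoms of each kind
    num_interactions: int          # dimension 3: possible pairwise interaction rules
    num_conditions: int            # dimension 4+: environmental conditions (just a count here)

    def as_tuple(self):
        """Return the measure as the n-tuple described in the text."""
        return (self.num_kinds, self.abundances,
                self.num_interactions, self.num_conditions)


# The impoverished example: two component kinds in enormous numbers,
# one interaction rule, five conditioning parameters.
impoverished = ComplexityMeasure(
    num_kinds=2,
    abundances=(10**100_000, 10**20_000),
    num_interactions=1,
    num_conditions=5,
)

m = impoverished
print(f"kinds={m.num_kinds}, interactions={m.num_interactions}, conditions={m.num_conditions}")
```

Comparing two systems then means comparing tuples dimension by dimension rather than collapsing everything to a single number.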
The salient point is that it is not very useful to think of complexity in overly vague terms. Or, put another way, complexity is complex! The reason we want more precision in thinking about what it means to be complex is that we can then start to think about what sorts of interesting things a complex system can do under the right circumstances. To be definite, the circumstance I refer to is the flow of energy through the system. Depending on the situation at the boundary and the internal complexity of a system, various conditions of energy flow can produce very interesting results. In fact, at some sufficient level of complexity and at just the right flow rate of just the right quality of energies, very interesting structures can form and move and further interact with one another. Emergence can take place. Selection can take place. And organization can take place.
The surface of the Earth receiving the influx of solar energy is such a system. As the planet formed from rocky, dusty, and gaseous debris circling Sol, there was much disorganization but great potential in terms of the complexity of the system. Through the combined energy fluxes of sunlight and gravitational forces, things happened. The Earth got interesting.
Energy Flow Jostles Components
Energy flow is, itself, a somewhat problematic concept. Energy isn't exactly something you can see or feel, except by its effects on matter. Perhaps the quintessence of pure energy is the photon. The photon is neither a particle nor a wave, but both, more or less simultaneously. Photons move and mediate the electromagnetic force. They travel at the speed of light through a vacuum and interact with electrically charged particles like electrons and protons. In doing so they can 'move' the particle, and this effect is what we mean by energy doing a kind of work.
Sensible heat (what you feel as heat) is carried by a very low energy form of photon, in the infrared radiation range, jostling the electrons and atoms in your skin, mostly in the molecules of water in your cells. So heat, the lowest grade of energy, has an ability to interact with matter, but depending on the relative temperatures of the heat source and the heat sink it may or may not do any work to speak of. The higher the temperature of the source relative to the sink, the more 'pressure' will exist to push the energy flow through the intermediate system. The same principle applies to other forms of energy, e.g. when electrons themselves move and mediate the electromagnetic force (i.e. electricity). In that case the pressure is called voltage.
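As a small numerical illustration of that 'pressure' idea (my own addition, going slightly beyond the analogy in the text), the standard Carnot limit, 1 - T_sink/T_source, is one conventional way to quantify how much of a heat flow could, in principle, be turned into work:

```python
# Rough illustration (my own, not from the original text) of the 'pressure' idea:
# the bigger the gap between source and sink temperatures, the larger the fraction
# of the flowing heat that could in principle do work. The Carnot limit,
# 1 - T_sink/T_source, is one standard way to quantify that push (temperatures in kelvin).

def work_fraction(t_source_k: float, t_sink_k: float) -> float:
    """Maximum fraction of the heat flow that could be converted to work."""
    return 1.0 - t_sink_k / t_source_k

# Sunlight (effective source ~5800 K) versus low-grade waste heat (320 K),
# both flowing into surroundings at about 290 K.
print(work_fraction(5800.0, 290.0))   # ~0.95: high-grade energy, lots of push
print(work_fraction(320.0, 290.0))    # ~0.09: low-grade heat, very little push
```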
In a reasonably complex but not necessarily organized system with an external source of energy, say heat at a high temperature, and an external sink at a low temperature, energy will flow through the system (if the boundary conditions permit!) from the source side to the sink side, jostling any matter that is in the way. The important point here is that the jostling causes our 'atoms' to bang into one another as they become particles mediating a mechanical force. They may also have the capacity to absorb a bit of the higher-potential energy in the form of a potential to 'stick' together. Chemical bonds created by jostling are an example. But when the atoms stick together, they form more complicated, organized objects. More importantly, these objects may interact with one another in more complicated interrelations.
In addition to 'chemical' interactions, the material is mechanically pushed, so to speak, toward the low temperature sink side. In the strictly thermal model we've been using, the act of heating particles at the high temperature source causes them to initially spread out, creating a high pressure zone at that end. At the low temperature end, the particles are cooled and tend to cluster closer together, creating a low pressure zone. Matter moves from regions of high pressure to regions of low pressure. That is why we have wind! So the mass starts a more organized movement from the source end to the sink end. At that end, the cooler particles are displaced by incoming hotter particles and tend to move to the outside (not outside the boundary) of the column of particles moving from the hot region to the cold one. They, in turn, continue to move along the periphery toward the hot end, where they get re-heated, setting up what is called a convection cycle. Of course a lot depends on the overall geometry of the system, but some form of organized motion is set up by this flow-through of energy.
So just with the flow-through of energy from a high potential source to a low potential sink, and given a sufficient level of general complexity, we have an increase in organization, both mechanical and chemical. And that organization will persist so long as energy flows. Note that the entropy of the whole arrangement, the meta-system of energy source, system (of interest), and energy sink, is actually increasing as the potential between source and sink tends toward equilibrium. But locally, within the system (of interest), order increases and entropy declines. Turn off the energy flow, however, and the system will return to its original disorganized, amorphous state.
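The bookkeeping behind that claim can be checked with a line of arithmetic. Here is a back-of-the-envelope sketch (my own illustration, assuming a steady flow and fixed source and sink temperatures):

```python
# Back-of-the-envelope check (my own illustration, assuming steady flow and fixed
# temperatures) of the entropy bookkeeping above: a parcel of heat Q leaves the hot
# source, passes through the system of interest, and is dumped into the cold sink.

Q = 1000.0        # joules of heat flowing through
T_source = 600.0  # kelvin
T_sink = 300.0    # kelvin

dS_source = -Q / T_source   # the source loses entropy as it gives up heat
dS_sink = Q / T_sink        # the sink gains entropy as it absorbs that heat
dS_total = dS_source + dS_sink

print(dS_total)   # about +1.67 J/K: the meta-system's entropy rises overall,
                  # which is what 'pays' for any local ordering in the system
                  # of interest while the flow lasts.
```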
Organization Emerges
A lot now depends on the types of components, their relative abundance, and the kinds of interactions that can take place between them. And some luck helps!
The process described above is what we call stochastic. The actual interactions that take place are a result of chance meetings. However, given that the energy flow is strong and the complexity high, the chances for very interesting interactions are high. Some particle interactions or combinations are more stable under the given conditions of the system with its flow-through of energy. The system will have a characteristic temperature gradient from source to sink, and the same applies to pressure and composition gradients. The stable combinations will tend to persist, to be robust along the gradient, and in doing so change the relative concentrations of 'atoms' within the system. Combinations that are not stable will end up coming apart and their components will return to the population of particles. Even the most stable combinations will eventually come apart just due to chance jostling of the wrong kind, but this happens much less frequently than the coming apart of not-so-stable species. The internal environment and dynamics of the system tend to select for certain stable forms.
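A toy Monte Carlo sketch (entirely my own construction, not a model from this series) makes the selection effect easy to see: two kinds of combinations form at the same chance rate, but the stable kind is knocked apart far less often, so it comes to dominate.

```python
# Toy Monte Carlo sketch (my own construction, not a model from this series) of
# selection by differential stability: both kinds of combination form at the same
# chance rate, but the 'stable' kind is knocked apart far less often.

import random

random.seed(1)

FORM_PROB = 0.05                       # chance per step that a new combination forms
BREAK_PROB = {"stable": 0.001,         # stable combinations rarely come apart...
              "fragile": 0.05}         # ...fragile ones come apart often

counts = {"stable": 0, "fragile": 0}

for step in range(10_000):
    for kind in counts:
        if random.random() < FORM_PROB:            # chance jostling builds one more
            counts[kind] += 1
        # each existing combination independently risks being knocked apart
        counts[kind] = sum(1 for _ in range(counts[kind])
                           if random.random() > BREAK_PROB[kind])

print(counts)   # stable combinations accumulate (dozens); fragile ones hover near zero
```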
Now, again, if the system is sufficiently complex, it will happen that some of these stable forms interact with one another in very interesting ways. They can mutually reinforce or benefit one another. For example, species A might 'help' species B by having a strong affinity for one or more of B's component parts yet be able to give those parts up to the formation of B without itself being destroyed. A simple example is that of a catalyst in chemical reactions. A particularly important form of catalysis is cyclical, where A helps B and B helps A. Such a reaction can easily come to dominate the system, leading to a predominance of A and B. More complex loops of mutually reinforcing species are possible. Indeed, the origin of life is now widely thought to be wrapped up in these mutual 'autocatalytic' loops involving precursors of RNA and proteins.
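Here is a minimal sketch of that mutual-reinforcement idea (a toy model of my own, not anything from the text or the origin-of-life literature): A catalyzes the formation of B, B catalyzes the formation of A, and a third species C forms at the same baseline rate but gets no help. All three draw on a shared pool of raw components replenished by the flow.

```python
# Minimal toy model (my own, not from the text or the origin-of-life literature)
# of a mutually catalytic pair: A speeds the formation of B and B speeds the
# formation of A; species C forms at the same baseline rate but gets no help.
# All three draw on a shared pool of raw components replenished by the flow.
# Simple Euler integration.

FEED = 1.0        # rate at which the flow replenishes raw components
BASE = 0.01       # uncatalyzed formation rate
CATALYSIS = 0.5   # strength of the A <-> B mutual help
DECAY = 0.05      # rate at which all species come apart again
DT = 0.01         # time step

pool, a, b, c = 10.0, 0.1, 0.1, 0.1

for _ in range(10_000):                      # integrate to t = 100
    form_a = (BASE + CATALYSIS * b) * pool   # B helps make A
    form_b = (BASE + CATALYSIS * a) * pool   # A helps make B
    form_c = BASE * pool                     # C gets no catalytic help
    pool += (FEED - form_a - form_b - form_c
             + DECAY * (a + b + c)) * DT     # decay returns parts to the pool
    a += (form_a - DECAY * a) * DT
    b += (form_b - DECAY * b) * DT
    c += (form_c - DECAY * c) * DT

print(round(a, 2), round(b, 2), round(c, 2))   # A and B dominate; C stays marginal
```

The point is only qualitative: once two stable forms help each other form faster than they decay, they soak up most of the flow-through, which is the predominance of A and B described above.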
The important thing to recognize in all of this is that the generation of greater organization is driven by the flow-through of energy but is also dependent on some elements of chance. For example, of two equally potent autocatalytic processes that might have emerged, both competing for the same components and both equally probable, chance will decide which one forms first. And that chance formation might make a great difference in what evolves next.
Evolution
So here we are at the beginning of something really interesting. The system (the Earth) is driven by several energy flows. The one with the most evident impact on the emergence and evolution of higher and higher structure is the flow from the Sun. Light energy drives life, for the most part. Gravitational energy, in the form of the high temperatures it creates at the core of the planet (and in the mantle) and the tidal dynamics caused by our Moon, has certainly contributed to helping organize the way things developed. The fact that the Earth rotates daily means that the high energy input source and the sink (deep space) alternate, so that energy flow is a kind of pulsed (sinusoidal) affair at points between the Arctic and Antarctic circles. At higher latitudes, near either pole, the pulses are seasonal rather than daily! Nevertheless, life has evolved on this planet in a drive toward increasing organization and order, changing high entropy complexity into low entropy complexity over the past 4.5 billion years.
Another way to reflect on the nature of organized complexity is our old friends information and knowledge. Going back to "Systems Science 5 — Information and Knowledge" we can see that the structure (knowledge) and the generation of messages by the dynamics of those structures (information) are another way to measure complexity.
We've now built a basis for understanding general systems. We have several 'tools' for describing systems and their dynamics and evolution, at least qualitatively. The reader, I hope, appreciates that there are more formal and mathematically sophisticated ways to handle these concepts. But for purposes of introducing the general reader to the concepts of systems we have made some good progress.
In the next installment I will be turning to the taxonomy of systems in preparation for starting to apply systems theory to the real world. We have to be able to differentiate aspects of real-world systems in order to understand some additional important phenomena, but also to apply our tools to systems we encounter in life. My intent is to show that a grounding in systems science will allow anyone to tackle understanding diverse and seemingly disparate objects and processes in this world.
I was hoping your posts would help clarify something I have been struggling over for a while..... The tough question: does increased complexity of an ecosystem lead to increased stability?
I've read numerous accounts both in favour and opposed to the hypothesis and now I find I'm not even sure about the complexity tag; should we be talking about diversity?
Maybe there is no clear cut answer.. perhaps increased complexity (that word again!) reduces resistance to change (by external perturbations) but increases resilience of recovery (from aforementioned perturbations). But does that depend on how we measure and what criteria are used? I'm confused just trying to explain....
The reason the answer is so important is that attempting to increase the complexity of a utopian society by 'mimicking' natural ecosystem interrelations may have unexpected repercussions, positive or negative.
What are your thoughts on this, George? Apologies if you have already covered it or it is forthcoming.
Posted by: GaryA | August 14, 2009 at 08:38 AM
Hi GaryA.
Someday this series may end up in book form. It is actually just a stream-of-consciousness outline for said potential book. If so, I hope to go into much greater detail in this whole complexity/stability thing. I didn't mention Tainter, who basically asserts that at some level of complexity social systems become unsustainable. Thomas Homer-Dixon asserts that it is the size scale, more than complexity, that accounts for degradation and collapse. Jared Diamond covers both ideas to some extent.
From a purely theoretical point of view, I think there is evidence (mostly from computer models) that stability as a function of complexity goes through a maximum (and one needs to be exceedingly careful about what one is measuring as complexity) after which there is a diminishing return -- an inverted U curve.
Some people count redundancy in with complexity. Redundancy does improve recovery/stability up to a point but with obvious costs.
In my own view, as complexity (as I tried to describe it) increases, e.g. more and larger combinations of atoms, then either new larger-scale properties emerge that allow the system to retain more potential energy from the energy flow stream, or the combinations increase dissipation, which leads to degradation. Of course, if a steady state obtains, both could be operating at the same time, but then complexity would no longer increase.
In any case, I think it is still very much a problem with definitions and details. I would hope to dig deeper into this if I do decide to write the book, or maybe in some future blog. I would love to hear more of your ideas though. Feel free to e-mail me if you think it is too much for a blog comment.
George
Posted by: George Mobus | August 14, 2009 at 09:58 AM
That makes sense.. reading it I had a vision of the human brain as a complex system giving us the emergence of consciousness, which at higher levels has a tendency to become 'less sure' (stable) as it processes more complex and subtle information. Insanity and genius at the tipping point sort of thing..... Babbling this because I still see no sign of a resolution of the complexity/stability dilemma!
Will have a think and may post further, if need be, via e-mail. Thanks for the offer!
Posted by: GaryA | August 15, 2009 at 08:46 AM
I look forward to any thoughts you have on it.
George
Posted by: George Mobus | August 16, 2009 at 01:12 PM