Technology: The Great Salvation?
Three Mile Island; Chernobyl; now Fukushima. Nuclear power may be in trouble. Exxon Valdez; Deepwater Horizon; fracking for oil and gas. Fossil fuels may be in trouble. Computer security breaches. The Internet may be in trouble. These are all examples of very complex technological systems that have not performed as well as they were supposed to. There are many other examples.
“Ah, but a man's reach should exceed his grasp, or what's a heaven for?” -- Robert Browning
Indeed, what is a heaven for? Humanity's reach does exceed its grasp. We can touch the outer reaches of technological wonders, but we don't seem able to grab hold in any stable way. Our most advanced and complex technologies are failing to serve us as expected. What drives us to create greater complexity? What purpose is served by greater complexity? Are we better off touching greater complexity and then dropping it, so to speak, on our toes because we have no real grasp?
One answer to the ‘what drives us’ question was given by Joseph Tainter in The Collapse of Complex Societies. He posits that the increase in complexity comes from our attempt to solve problems. We invent things and procedures aimed at making life better or easier, or at thwarting a danger we perceive. But it turns out that very often the solution to one kind of problem introduces unintended consequences and thus generates new problems. Ergo we are on a spiral of ever-increasing complexity of problems and complexity of solutions.
Tainter also posits that at some point the benefits gained by solving problems with more complexity start to diminish, and, indeed, can start to produce negative results. That is, there is a law of diminishing returns that attends increasing complexity. He uses this framework to analyze the demise of civilizations throughout history.
One way to generalize the drive to greater complexity is to recognize what we humans consider problems. We have biological needs to meet: enough food, water, shelter, sex, and so on. Our problems start when any of these are in jeopardy or hard to get. We invent tools to make the getting easier and more reliable. Then the same kind of thinking extends to our other needs in the Maslowian hierarchy. Back when energy resources were limited to real-time sources (animal and water power) and limited storage (wood fuel), there was a very direct link between having enough energy to invest in inventions that aid in achieving our needs and the perception that we were, indeed, solving problems effectively. But as fossil fuels started to play an increasing role in providing extrasomatic power, humans found it possible to invent not only to fill needs but to fill wants as well. We started getting spoiled.
Then in the 20th century we learned how to create artificial needs by linking wants to those biological needs. Advertising and marketing turned those automobiles from merely providing convenient personal travel into representations of our sexuality and social power. Now invention for the purpose of solving our ‘created’ needs has taken on a life of its own. Every little inconvenience looks like a problem to be solved. If we don't get instant gratification from our material wealth, something is drastically wrong and needs more invention. Heaven is being able to have no limits on our ability to fill our desires.
That takes care of the reach, but there is the grasp to consider. Our best engineering practices cannot seem to prevent our inventions from failing, and failing catastrophically. Every time we solve a local problem we create a global one worse than what we started with. For example, in our attempts to solve our energy needs we have created a monstrous infrastructure that is starting to consume more energy than it produces while simultaneously destroying our environment. Our attempts to keep the oil flowing have gotten us into a situation where we are using too much of the oil we do get just to obtain the next unit of oil! Even as we see the problems associated with obtaining more oil, we look to new and more complex technologies to extract marginal resources such as tar sands and ultra-deepwater pockets.
As long as each human in the population is out to solve his or her own local set of problems, and expects that to be accomplished by putting pressure on politicians, we will continue to fail to recognize that the global problems are getting worse. We will push for more technology to solve our problems. We will, for example, insist on things like increased drilling in marginal areas, requiring much more elaborate and costly infrastructure, even as the net return goes negative. When the lights go out for long periods due to the peaking of fossil fuel energy production, we will insist on using nuclear power in spite of what we've seen. The greens will insist on conversion to alternative (renewable) energy sources, even though that means complexifying our electrical distribution system further and converting to electric transportation, which is electronically complex. It won't matter that there really isn't enough time to carry that plan out. Nor will it matter how much more at risk some of us will be when some of that complex infrastructure fails. We will do it because we are spoiled and deeply believe that solving problems with technology is what we have to do.
President Obama believes (as do so many others) that we will innovate our way out of the energy/global warming crises. He believes in “clean coal”, for example. He probably has very little idea about the nature of the technology required to remove CO2 from the emissions stream and then sequester it ‘somewhere’. I doubt that Steven Chu, Energy Secretary, has bothered to explain to him how complex such an operation would be and what risks are associated with, for example, storing the gas in underground reservoirs. I say this because nobody actually knows how complex the operation might end up being or what risks we should consider. A telling sign was the continual escalation of cost estimates for the FutureGen power plant proposal. Anytime a ‘system’ gets more expensive, you can bet that the culprit is escalating complexity. Another classic and damning story is that of the M2 Bradley fighting vehicle (see the film The Pentagon Wars for a black-humor perspective).
Let's face it. We are not just addicted to oil (and high-powered fossil fuels); we are addicted to technology as our great savior. For virtually all of mankind's history we have gotten a ‘high’ from technology. Partly this was because technology seemed to make us more comfortable, and partly it stems from the novelty factor. New solutions are almost always exciting and promising. For most of that time technology did seem to help us solve problems, even if it did tend to get more complex over time. We had plenty of energy to invest in complexity, so why not do it? But then we ended up addicted. Like all addicts we are hooked and cannot possibly imagine that this isn't the right thing to do, or at least that it isn't really that bad to do. Belief is a wonderful thing when what you believe in also happens to correspond with reality.
In reality technology has always been a two-edged sword, not only in the sense that it can be used for good or ill (e.g. television had the potential to be a great liberating/educational medium), but also in the sense that for every supposed benefit it provides there is an associated cost. All too often the costs are hidden from plain view, or we don't look hard enough because we are so enamored with the superficial benefit (iPhones come to mind). And, as Tainter points out, too often the complexity associated with increasing technology (and here I mean both tool inventions and things like organizational bureaucracies and procedures) carries hidden costs that leave us with a net negative benefit in reality.
It seems to me we are witnessing the time when increasing technological complexity obviously costs more than we gain. What makes this even worse is the reliance on fossil fuels for the power to keep the technology going. We are already in the time of decreasing net energy available to our economy, which means we cannot even maintain what we have, let alone build up some massive new and complex technology, such as huge solar collectors in the deserts, at a scale matching our current population and consumption requirements.
The terrible tragedy playing out in Japan right now has some deeper lessons to offer. My sincere condolences go out to all the victims of the quake and tsunami. But the level of destruction and the dangers of nuclear radiation attest to the degree to which we always assume everything will be all right and put our trust in complex technology. And the more complex the system, the greater the failure when things go wrong. I am not holding out hope that this example will wake up anyone with influence. I don't expect to see the vast majority of humanity suddenly come to grips with their addictions. Beliefs that do not correspond with reality, like a belief in a heaven, do not get dislodged easily. People don't usually voluntarily question everything.
George, I do not believe that technology is inherently too complex. I think that most complexity arises from the mismatch between theory and operational requirements.
You will appreciate a computer programming analogy. If you have a good grasp of your problem and the proper architecture to address it, then it is possible to make very large and complicated programs that are easy to maintain and nearly bug-free. However, if you do not have good requirements, or you have to guess at the architecture, then you find that while it is possible to create something that works "well enough," the mismatches have to be programmed outside the architectural framework, creating huge complexity and rigidity. In that case bugs can multiply exponentially with size. My uncle told me long ago that all truly good programs are created once based on what you think you need to do, and then recreated from scratch once you know what you actually need.
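To make the contrast concrete, here is a toy sketch (all names hypothetical, not from any real codebase): when mismatches are patched outside the framework, every new case is wired into existing control flow, while a design with a proper extension point absorbs new cases without touching old code.

```python
# Workaround style: each unforeseen requirement becomes another special
# case inside the control flow, so coupling grows with every patch.
def handle_request_patched(kind, payload):
    if kind == "order":
        return f"order:{payload}"
    elif kind == "refund":            # bolted on after the original design
        return f"refund:{payload}"
    elif kind == "refund_partial":    # another bolt-on, duplicating logic
        return f"refund:{payload}:partial"
    else:
        raise ValueError(kind)

# Architecture style: one extension point (a registry); new cases register
# themselves, so the program grows in size without growing in rigidity.
HANDLERS = {}

def register(kind):
    def wrap(fn):
        HANDLERS[kind] = fn
        return fn
    return wrap

@register("order")
def _order(payload):
    return f"order:{payload}"

@register("refund")
def _refund(payload, partial=False):
    return f"refund:{payload}:partial" if partial else f"refund:{payload}"

def handle_request(kind, payload, **opts):
    return HANDLERS[kind](payload, **opts)
```

Both do the same job today, but only the second one stays maintainable as requirements keep arriving, which is the uncle's "rewrite it once you know what you need" in miniature.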
I think this is true of society as well. Take this documentary by Adam Curtis about light water reactors -- the exact ones that are having problems in Japan. The inventor of the technology explicitly states that the design was never meant to be scaled up for industrial use, and that doing so required a huge amount of complexity to try to make the reactors "safe," yet it was impossible ever to be sure that any such system would work. By contrast, there are nuclear reactor designs that were made to be large scale and are inherently safe due to the basic laws of physics.
The selection of the LWR was made for business and political reasons. This comment left on a post of mine explains the background.
It is exactly the same for the Internet, which was designed for small-scale point-to-point use. The other examples may have been designed with their purpose in mind, but they are (very annoying) examples of needing technological solutions that try to paper over a fundamental flaw in our socioeconomic system.
I think that this reaction makes sense. When technology or knowledge is developed that can radically change the makeup of society, the choices are either to change the foundations of society to conform to the new idea, or to change the idea so that it becomes a mere incremental improvement to society. The latter is chosen for obvious reasons, the most glaring being profit and power. Over time these mismatches add up, and we get the Tainter observation.
However, I strongly believe that if a society had a symbiotic relationship with its technological foundations, in which the social laws were self-consistent with the scientific ones and adjusted as required, then you would not have the problem of out-of-control complexity. This is impossible to achieve fully in practice, but it is a sufficient basis for a truly alternative ideology that I wish systems/peak oil people would work on bringing into reality themselves.
Posted by: mikkel | March 14, 2011 at 11:23 PM
A solar power systems is interspersed between arrays of photovoltaic cells so that it could effectively provide heating to all corners of the building. The insulation pipes and ducts of the heating system can be minimized, and thus, save more on building construction cost while building a built-in alternative electrical power.
[Blogger's note: I've left the comment in but removed the link to the commercial site. No advertisements allowed. This comment verges on being an ad, so if the originator again attempts to post a link to a commercial site I will mark their IP as a spam source.]
Posted by: solar power system | March 15, 2011 at 02:06 AM
George,
Good analysis of the complexity issue. It occurs to me that the agricultural 'revolution' was one of the seminal 'increasing complexity' events in human history. Once we went down that road, the very definition of civilization came to be, in all of its complex glory.
I've been reading up on various nutritional studies and points of view lately. The concept of 'diseases of civilization' hearkens back to this time in that these diseases did not exist to any detectable degree before the ag revolution. You can mark this turning point as the beginning of so many human ills, it's tempting to change the phrase to the 'disease of civilization'.
What a predicament.
Posted by: Eric Thurston | March 15, 2011 at 10:30 AM
George, presumably in your 'day job' you have played your own small part in increasing the complexity of civilization. Did you realise that was what you were doing at the time, and were you troubled by it (without necessarily knowing why)?
Posted by: Julian Colander | March 15, 2011 at 03:29 PM
Mikkel,
You are right that technology by itself is not inherently 'over' complex. But the problem is that the system of interest isn't just technology but the human society in which it is embedded and exploited. So as you yourself point out, it is our mistakes in our use of technology (as well as its ill-conceived implementation) that are problematic.
If there is any solution to this situation it will likely resemble the one taken in biological evolution, which I have tried to explain in my systems science working papers. Essentially, biology found ways to support higher degrees of complexity by evolving hierarchical control structures (think of the range from prokaryotic cells to multicellular organisms and populations). Our brains represent the epitome of this evolution -- just not quite enough so!
I suspect that if Homo makes it through the bottleneck, future societies might reflect this same basic pattern, with governance organized into a more structured system capable of managing the complexity of human society.
-----------------------------
Solar,
I'm watching you!
-----------------------------
Eric,
Thanks. In my sapience work I mark the advent of agriculture as the point at which selection for greater sapience got reduced in favor of logistical and tactical organization thinking skills. The strategic thinking required for true wisdom was thus not further developed except in the occasional rare individual -- those we recognize today as wise.
Jared Diamond wrote about the connection between disease and animal husbandry in Guns, Germs, and Steel.
------------------------------
Julian,
I certainly have thought about this a lot. No, when I got into the field (at age thirty-something) I did not think about it this way. So I merely plowed ahead.
On the other hand, as Mikkel (above) pointed out, technology is not by itself inherently complex in the negative sense. And as I mentioned to him, complexity has been dealt with in biological evolution on many levels. So by itself complexity is not bad or evil. It is our inability to manage it, and especially our inability to recognize our own managerial limitations, that is the problem. That is what I have come to recognize.
These days, in my day job, I actually teach my students about the inherent trade-offs in increasing complexity and the law of diminishing returns. I use examples from microcomputer architecture and operating systems design to show that increasing functional complexity always carries a cost that may not be obvious to the designer. The history of architecture evolution is full of such examples. But, thanks to Moore's law, we have mostly been on an improved-performance trajectory. The limits of increasing complexity, such as those exhibited in the number of stages in a modern pipelined CPU, do come up from time to time.
Then, of course, there are the complexities of Microsoft products!!!!
George
Posted by: George Mobus | March 15, 2011 at 05:46 PM
This discussion reminded me of one of my favorite quotations:
“Upon this gifted age, in its dark hour, rains from the sky a meteoric shower of facts. They lie unquestioned, uncombined. Wisdom enough to leech us of our ill is daily spun; but there exists no loom to weave it into fabric.” -- Edna St. Vincent Millay
(I think this is likely a favorite of systems thinkers. In doing a quick search for it, I found that it was quoted by Murray Gell-Mann at the end of a talk.) We simply do not have the loom. Universities could be the loom, but their mission these days is to train people to fit into the consumer society. Religion, almost by definition, imposes too many boundaries, and religious institutions are also bound to the consumer society in order to generate operating/growth funds. (Angelo Mozilo, Countrywide CEO, was until just recently a trustee of Gonzaga U.) Corporations are too short-term focused to lead the way. It is a worthwhile enterprise, in my opinion, to attempt to envision what a loom/system of this nature (clue?) would look like.
We can't handle complexity because we do not have the loom/system to handle complexity.
Posted by: Matt Holbert | March 16, 2011 at 10:07 AM
Well, one of the other interesting limits to complexity is that in seeking high rates of return, as limits to growth create large hidden liabilities, investors naturally look for investments whose liabilities are hidden from *them* -- like the global effects of demand persistently exceeding supply for food and fuel resources.
http://synapse9.com/blog/2011/03/16/the-difference-between-cash-cows-and-crash-cows/
Posted by: Phil Henshaw | March 17, 2011 at 11:10 AM
Matt,
We don't have a unified loom, it's true. But I would argue that systems science could provide such a loom. What would then be missing is the intellectual and sapient horsepower to use it.
------------------------------
Phil,
Interesting point.
George
Posted by: George Mobus | March 23, 2011 at 11:24 AM
This should be required reading -- like all the others you post! It reminds me of the Wendell Berry article "Faustian Economics," available on Harper's web site. Why are we so afraid of limits? Is it because they remind us of death?
Posted by: Erin Mooney | March 30, 2011 at 09:37 AM
Erin,
Great question. Our brains are very conflicted re: death. We have just enough foresight to know of our own demise. It isn't clear that other animals have this same kind of understanding of mortality. Old elephants and chimpanzees apparently do know when they are about to die; they disappear into the jungle to succumb. We humans, OTOH, cling to every minute we can get, as a rule.
As for complexity limits, we have simply never really known any, because we have always found new sources of energy to keep growing on. Thus we are spoiled, and expectations are that a new technology or energy source will allow us to go on as if there were no limits.
Logic and science seem not to count for much, do they?
George
Posted by: George Mobus | April 03, 2011 at 02:56 PM