Buy the Book: Print
Systemantics humorously describes how complex systems work and why they are far from perfect. The author’s systems axioms, cataloging the flaws and failure points of systems, are the reason humility and skepticism, rather than confidence, are needed when dealing with systems.
The Notes
- Systemism: the state of mindless belief in systems; the belief that systems can be made to function to achieve desired goals.
- Everyone likes to point out the problems in “the system.” They have solutions, and if only their ideas were adopted, their “system” would work better. The flaw is in not understanding how systems work…and don’t work.
- “The fundamental problem does not lie in any particular system but rather in systems as such. Salvation, if it is attainable at all, even partially, is to be sought in a deeper understanding of the ways of systems, not simply in a criticism of the errors of a particular system.”
- Most people are overly concerned about the end result, when they should focus more on the process that produced the outcome.
- “No one can afford not to understand the basic principles of how systems work. Ignorance of those basic laws is bound to lead to unrealistic expectations of the type that have plagued dreamers, schemers, and so-called men of affairs from the earliest times.”
- Murphy’s Law — if it can go wrong it will — is common to systems. Systems act up.
- “Human systems are not prevented from working by some single, hidden defect, whether of communication or anything else. Failure to function as expected is an intrinsic feature of systems, resulting from laws of systems-behavior that are as rigorous as any in Natural Science or Mathematics.”
- The creation of a new system to solve a problem is bound to create unanticipated new problems. The example of a system for garbage collection leads to problems of collective bargaining, truck maintenance, how to fund it, voter approval, what to do during poor weather or national holidays, separating trash before pickup, etc.
- Anergy:
- “Any state or condition of the Universe, or of any portion of it, that requires the expenditure of human effort or ingenuity to bring it into line with human desires, needs, or pleasures is defined as an Anergy-state. Anergy is measured in units of effort required to bring about the desired change.”
- Law of the Conservation of Anergy: “The total amount of anergy in the universe is fixed.”
- “Systems operate by redistributing anergy into different forms and into accumulations of different sizes.”
- Anergy is the opposite of energy. Anergy resides within mess situations as energy resides within a coiled spring. A coiled spring is full of energy. When uncoiled, it is full of anergy.
- “In really large and ambitious systems, the original problem may persist unchanged and at the same time a multitude of new problems arise to fester (or ferment) unresolved.”
- “Systems are like babies: once you get one, you have it. They don’t go away. On the contrary, they display the most remarkable persistence. They not only persist; they grow. And as they grow, they encroach… Administrative systems maintain an average rate of growth of 5 to 6 percent per annum regardless of the work to be done.”
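Gall’s 5-to-6-percent figure compounds quickly. A minimal sketch (assuming steady exponential growth, which is a modeling assumption rather than the book’s claim) shows the implied doubling time of an administrative system:

```python
import math

# Doubling time for a quantity growing at a steady compound annual rate.
# The 5-6 percent range is Gall's; the compounding model is an assumption.

def doubling_time(annual_rate: float) -> float:
    """Years for a quantity to double at the given compound annual rate."""
    return math.log(2) / math.log(1 + annual_rate)

for rate in (0.05, 0.06):
    print(f"{rate:.0%} growth -> doubles in {doubling_time(rate):.1f} years")
```

At those rates, an administrative system doubles in size roughly every twelve to fourteen years, regardless of the work to be done.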
- “In the United States, the Internal Revenue Service not only collects our taxes, it also makes us compute the tax for them, an activity that exacts an incalculable cost in sweat, tears, and agony and takes years off our lives as we groan over their complicated forms.”
- We should expect the unexpected. “Things not only don’t work out well, they work out in strange, even paradoxical, ways. Our plans not only go awry, they produce results we never expected. Indeed they often produce the opposite result from the one intended.”
- Harvard Law of Animal Behavior: “Under precisely controlled experimental conditions, a test animal will behave as it damn well pleases.”
- “The behavior of complex systems, generally, whether living or nonliving, is unpredictable.”
- “As long as a system exists only in the head of its creator, we would agree that it might be knowable in all its implications. But once that system is translated into the real world, into hardware and people, it becomes something else. It becomes a real-world thing, and mere mortals can never know all there is to know about the real world.”
- Le Chatelier’s Principle: when a variable that determines a system’s equilibrium is changed, the system shifts to counteract the change and settles into a new equilibrium.
- “Goals and Objectives” statements are often used to measure “success.” Yet the statements often bind the writer and eliminate the flexibility needed in any creative endeavor, because deviation from the “goals and objectives” is considered a “failure” by those who instituted the statements in the first place.
- “The natural tendency of systems is to set up negative feedback to their own operations.”
- Positive feedback, in a poorly functioning system, only encourages more of the same.
- Oscillating systems alternate between positive and negative feedback, creating an endless cycle bouncing between two extreme states. The length of the cycle is often generational — long enough for the new generation to not know how things worked out the last time someone tried their way of doing things.
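The difference between corrective and amplifying feedback can be sketched numerically. This toy model is not from the book; the names `gain`, `setpoint`, and `run` are invented for illustration:

```python
# Toy feedback loop: each step nudges the state by gain * (setpoint - state).
# With gain > 0 the correction opposes the error (negative feedback) and the
# state converges on the setpoint; with gain < 0 the "correction" amplifies
# the error (positive feedback) and the state diverges.

def run(gain: float, steps: int = 20, setpoint: float = 0.0, state: float = 1.0) -> float:
    for _ in range(steps):
        state += gain * (setpoint - state)
    return state

print(run(0.5))    # negative feedback: converges toward the setpoint
print(run(-0.5))   # positive feedback: diverges away from the setpoint
```

The same starting deviation shrinks by half each step under negative feedback and grows by half each step under positive feedback, which is why the book treats positive feedback as dangerous.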
- The more complex the system, the more specific skills are needed to run different areas of the system. Complex systems require specialization, whereas simple systems allow for broad skills.
- “In general, the larger and more complex the system, the less the resemblance between the true function and the name it bears.”
- “Most of the things we humans desire are nonsystem things.”
- Operational Fallacy: complex systems do not do what they claim to do, and people in complex systems do not do what the system says they do.
- “The function performed by a system is not operationally identical to the function of the same name performed by a man. In general, a function performed by a larger system is not operationally identical to the function of the same name as performed by a smaller system.”
- “The function (or product) is defined by the systems-operations that occur in its performance or manufacture.”
- Fundamental Law of Administrative Workings (FLAW): Things are what they are reported to be.
- Or the real world is what is reported to the system.
- “The net effect of this Law is to ensure that people in systems are never dealing with the real world that the rest of us have to live in but with a filtered, distorted, and censored version which is all that can get past the sensory organs of the system itself.”
- “A system is no better than its sensory organs.”
- “To those within a system, the outside tends to pale and disappear.”
- “The bigger the system, the narrower and more specialized the interface with individuals.” i.e. people become numbers, accounts, etc.
- Coefficient of Fiction: R(0)/R(s), where:
- R(0) = the amount of reality that fails to reach the relevant people.
- R(s) = the total amount of reality input to the system.
- A Coefficient of Fiction above 0.99 is possible in certain delusional systems where reality is cut off entirely.
- Positive feedback is inversely correlated to reality. As positive feedback rises, the amount of reality going into the system falls, thus increasing the coefficient of fiction.
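As a toy calculation (the units of “reality” and the numbers below are invented for illustration), the ratio works out like this:

```python
# Coefficient of Fiction = R(0) / R(s): the fraction of a system's total
# reality input that never reaches the relevant people. Values near 1.0
# mean those people are operating almost entirely on fiction.

def coefficient_of_fiction(reality_lost: float, reality_input: float) -> float:
    if reality_input <= 0:
        raise ValueError("the system must receive some reality input")
    return reality_lost / reality_input

# Hypothetical system: of 1000 units of incoming reality, 995 are filtered
# out by reports, summaries, and sensory organs before reaching anyone.
print(coefficient_of_fiction(995, 1000))  # 0.995 -- delusional-system range
```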
- People in a system rise to their level of incompetence.
- “Immersion in a system tends to produce an altered state that results in various bizarre malfunctions, recognizable to us but not to the people in the system.”
- Functionary’s Fault: “A complex function induced in a system-person by the system itself, and primarily attributable to sensory deprivation.”
- Functionary’s Pride: “A kind of mania of self-esteem induced by titles and the illusion of power.”
- Hireling’s Hypnosis: “A trance-like state, a suspension of normal mental activity, induced by membership within a system.”
- “Systems-delusions are the delusion systems that are almost universal in our modern world. Wherever a system is, there is also a systems-delusion, the inevitable result of the Operational Fallacy and the FLAW in systems.”
- Manager’s Mirage: “The belief that some event (usually called an outcome) was actually caused by the operation of the system.” or “The system takes the credit (for what would probably have happened anyway).”
- Orwell’s Inversion: “The confusion of input and output.”
- Systems People:
- “The System has its effects on the people within it. It isolates them, feeds them a distorted and partial version of the outside world, and gives them the illusion of power and effectiveness.”
- “Systems attract and keep those people whose attributes are such as to make them adapted to life in the system.”
- “While systems-people share certain attributes in common, each specific system tends to attract people with specific sets of attributes… The System calls forth those attributes in its members and rewards the extreme degrees of them.”
- “A priori guesses as to what traits are fostered by a given system are likely to be wrong. Furthermore, those traits are not necessarily conducive to successful operation of the System itself.”
- “Systems attract not only Systems-people who have attributes for success within the system. They also attract individuals who possess specialized attributes adapted to allow them to thrive at the expense of the system… These persons attach themselves to systems, getting a free ride and a free lunch as long as the system survives.”
- Trying to make nonfunctioning systems work better doesn’t help.
- “Simple systems that work are rare and precious additions to the armamentarium of human technology. They should be treasured. Unfortunately, they often possess attributes of instability requiring special skill in their operation.”
- “Nothing is more useless than struggling against a law of nature.”
- Functional Indeterminacy Theorem: In complex systems, malfunction and even total nonfunction may not be detectable for long periods, if ever.
- “The problem of evaluating ‘success’ or ‘failure’ as applied to large systems is compounded by the difficulty of finding proper criteria for such evaluations.”
- “In general, we can say that the larger the system becomes, the more the parts interact, the more difficult it is to understand environmental constraints, the more obscure becomes the problem of what resources should be made available, and deepest of all, the more difficult becomes the problem of the legitimate values of the system.” — C. West Churchman, The Systems Approach
- “When a system continues to do its own thing, regardless of circumstances, we may be sure that it is acting in pursuit of inner goals.”
- The larger and more complex a system gets, the more likely its component parts are to fail. The design questions then become: how does it fail, and how well does it work when each part fails?
- The crucial variables in any system are likely to be discovered by accident.
- Systems Design:
- 1st Principle: Do it without a system if possible.
- “Systems are seductive. They promise to do a hard job faster, better, and more easily than you could do it by yourself. But if you set up a system, you are likely to find your time and effort now being consumed in the care and feeding of the system itself.”
- “Many functions are intrinsically unsuited to the systems approach.”
- “The great secret of systems design is to be able to sense what things can naturally be done easily and elegantly by means of a system and what things are hard — and to stay away from the hard things.”
- “Remember the Systems Law of Gravity, otherwise known as the Vector Theory of Systems: Systems run best when designed to run downhill. In human terms, this means working with human tendencies rather than against them.”
- Ex: Lotteries work regardless of economic conditions because they side with human nature’s willingness to gamble a small amount for a huge payoff.
- “Don’t make the system too tight. This is usually done in the name of efficiency, or (paradoxically) in the hope of making the system more permanent.”
- “Consider, for example, the System of the Family. The Family has been around for a long time. Our close primate relatives, the gorillas, form family units consisting of husband and wife and one or more offspring. As Jane Goodall has shown, gorillas take naps after meals. (Every day is Sunday for large primates.) The youngsters wake up too soon, get bored and start monkeying around the nest. Father gorilla eventually wakes up, leans on one elbow, and fixes the errant youngster with a penetrating stare that speaks louder than words. The offending juvenile thereupon stops his irritating hyperactivity, at least for a few minutes. Clearly, this is a functioning family system. Its immense survival power is obvious. It has weathered vicissitudes compared to which the stresses of our own day are trivial. And what are the sources of its strength? They are extreme simplicity of structure; looseness in everyday functioning; “inefficiency” in the efficiency-expert’s sense of the term; and a strong alignment with basic primate motivations.”
- Potemkin Village Effect: “Is especially pronounced in Five-Year Plans, which typically report sensational overachievement during the first four and a half years, followed by a rash of criminal trials of the top officials and the announcement of a new and more ambitious Five-Year Plan, starting from a baseline somewhat lower than that of the preceding Plan, but with higher goals.”
- Catalytic Managership: “Based on the premise that trying to make something happen is too ambitious and usually fails, resulting in a great deal of wasted effort and lowered morale. It is, however, sometimes possible to remove obstacles in the way of something happening. A great deal may then occur with little effort on the part of the manager, who nevertheless gets a large part of the credit.”
- “A system that is sufficiently large, complex, and ambitious can reduce output far below ‘random’ levels. Thus, a Federal Program to Conquer Cancer may tie up all the competent researchers in the field, leaving the problem to be solved by someone else, typically a graduate student from the University of Tasmania doing a little recreational entomology on her vacation. Solutions usually come from people who see in the problem only an interesting puzzle, and whose qualifications would never satisfy a select committee.”
- “The Token System is intended to provide for distribution of wealth along certain rational lines, such as the contribution of the individual to the common welfare. In practice, however, and in accordance with an ineluctable natural law of Systems-behavior, the tokens are accumulated by those whose primary virtue is skill in accumulating tokens — a point overlooked by Marx.”
- Garbage is a byproduct of a system. One system’s garbage may be another system’s raw material.
- Experts are people who know the facts, except those necessary to ensure the successful functioning of the system.
- Creativity is about finding a simple system for a new problem or reworking an existing problem to be solved more simply.
- Systems Axioms
- Systems in general work poorly or not at all.
- New systems generate new problems.
- Corollary: Systems should not be unnecessarily multiplied.
- Systems operate by redistributing anergy into different forms and into accumulations of different sizes.
- Systems tend to grow, and as they grow, they encroach.
- Alt: The system itself tends to expand at 5 to 6 percent per annum.
- Systems display antics.
- Alt: Complicated systems produce unexpected outcomes. The total behavior of systems cannot be predicted.
- Corollary: A large system, produced by expanding the dimensions of a smaller system, does not behave like the smaller system.
- Complex systems tend to oppose their own proper function.
- Alt: Systems get in the way. The system always kicks back.
- Corollary: Positive feedback is dangerous.
- People in systems do not do what the system says they are doing.
- The system itself does not do what it says it is doing.
- Alt: The function performed by a system is not operationally identical to the function of the same name performed by a man.
- Alt: The function performed by a larger system is not operationally identical to the function of the same name performed by a smaller system.
- Things are what they are reported to be. The real world is whatever is reported to the system.
- Corollary: A system is no better than its sensory organs. To those within a system, the outside reality tends to pale and disappear.
- Systems attract systems-people.
- Corollary: For every human system, there is a type of person adapted to thrive on it or in it.
- The bigger the system, the narrower and more specialized the interface with individuals.
- A complex system cannot be “made” to work. It either works or it doesn’t.
- Corollary: Pushing on the system doesn’t help. It just makes things worse.
- A simple system, designed from scratch, sometimes works.
- Alt: Simple systems may or may not work.
- Some complex systems actually work.
- Rule of Thumb: If a system is working, leave it alone.
- A complex system that works is invariably found to have evolved from a simple system that works.
- A complex system designed from scratch never works and cannot be patched up to make it work. You have to start over, beginning with a working simple system.
- Alt: Programs never run the first time. Complex programs never run.
- In complex systems, malfunctions and even total nonfunction may not be detectable for long periods, if ever.
- Large complex systems are beyond human capacity to evaluate.
- A system that performs a certain function or operates in a certain way will continue to operate in that way regardless of the need or of changed conditions.
- Alt: A system continues to do its thing, regardless of need.
- Systems develop goals of their own the instant they come into being.
- Intrasystem goals come first.
- Complex systems usually operate in failure mode.
- A complex system can fail in an infinite number of ways.
- The mode of failure of a complex system cannot ordinarily be predicted from its structure.
- The crucial variables are discovered by accident.
- The larger the system, the greater the probability of unexpected failure.
- “Success” or “Function” in any system may be failure in the larger or smaller systems to which the system is connected.
- Corollary: In setting up a new system, tread softly. You may be disturbing another system that is actually working.
- When a fail-safe system fails, it fails by failing to fail-safe.
- Complex systems tend to produce complex responses (not solutions) to problems.
- Great advances are not produced by systems designed to produce great advances.
- Systems run better when designed to run downhill.
- Corollary: Systems aligned with human motivational vectors will sometimes work. Systems opposing such vectors work poorly or not at all.
- Loose systems last longer and work better.
- Corollary: Efficient systems are dangerous to themselves and to others.
- Advanced Systems Theories:
- Everything is a system.
- Everything is part of a larger system.
- The universe is infinitely systematized, both upward (larger systems) and downward (smaller systems).
- All systems are infinitely complex. (The illusion of simplicity comes from focusing attention on one or a few variables.)