Thinking in Systems

Introduction: The Systems Lens

A system is a set of things — people, cells, molecules, or whatever — interconnected in such a way that they produce their own pattern of behavior over time. The system may be buffeted, constricted, triggered, or driven by outside forces. But the system’s response to these forces is characteristic of itself, and that response is seldom simple in the real world.
The behavior of a system cannot be known just by knowing the elements of which the system is made.

One, The Basics

A system is an interconnected set of elements that is coherently organized in a way that achieves something. If you look at that definition closely for a minute, you can see that a system must consist of three kinds of things: elements, interconnections, and a function or purpose.
Is there anything that is not a system? Yes — a conglomeration without any particular interconnections or function.
If information-based relationships are hard to see, functions or purposes are even harder. A system’s function or purpose is not necessarily spoken, written, or expressed explicitly, except through the operation of the system. The best way to deduce the system’s purpose is to watch for a while to see how the system behaves.
Purposes are deduced from behavior, not from rhetoric or stated goals.
A system generally goes on being itself, changing only slowly if at all, even with complete substitutions of its elements — as long as its interconnections and purposes remain intact. ... If the interconnections change, the system may be greatly altered.

Three, Why Systems Work So Well

Resilience is a measure of a system’s ability to survive and persist within a variable environment.
Resilience is not the same thing as being static or constant over time. Resilient systems can be very dynamic.
And, conversely, systems that are constant over time can be unresilient. This distinction between static stability and resilience is important. Static stability is something you can see; it’s measured by variation in the condition of a system week by week or year by year. Resilience is something that may be very hard to see, unless you exceed its limits, overwhelm and damage the balancing loops, and the system structure breaks down. Because resilience may not be obvious without a whole-system view, people often sacrifice resilience for stability, or for productivity, or for some other more immediately recognizable system property.
I think of resilience as a plateau upon which the system can play, performing its normal functions in safety. A resilient system has a big plateau, a lot of space over which it can wander, with gentle, elastic walls that will bounce it back, if it comes near a dangerous edge. As a system loses its resilience, its plateau shrinks, and its protective walls become lower and more rigid, until the system is operating on a knife-edge, likely to fall off in one direction or another whenever it makes a move. Loss of resilience can come as a surprise, because the system usually is paying much more attention to its play than to its playing space.
Self-organization is such a common property, particularly of living systems, that we take it for granted. If we didn’t, we would be dazzled by the unfolding systems of our world. And if we weren’t nearly blind to the property of self-organization, we would do better at encouraging, rather than destroying, the self-organizing capacities of the systems of which we are a part. Like resilience, self-organization is often sacrificed for purposes of short-term productivity and stability.
Self-organization produces heterogeneity and unpredictability. It is likely to come up with whole new structures, whole new ways of doing things. It requires freedom and experimentation, and a certain amount of disorder. These conditions that encourage self-organization often can be scary for individuals and threatening to power structures.
Complex systems can evolve from simple systems only if there are stable intermediate forms.
When a subsystem’s goals dominate at the expense of the total system’s goals, the resulting behavior is called suboptimization. Just as damaging as suboptimization, of course, is the problem of too much central control. If the brain controlled each cell so tightly that the cell could not perform its self-maintenance functions, the whole organism could die.
To be a highly functional system, hierarchy must balance the welfare, freedoms, and responsibilities of the subsystems and total system — there must be enough central control to achieve coordination toward the large-system goal, and enough autonomy to keep all subsystems flourishing, functioning, and self-organizing.
Hierarchical systems evolve from the bottom up. The purpose of the upper layers of the hierarchy is to serve the purposes of the lower layers.

Four, Why Systems Surprise Us

Everything we think we know about the world is a model. Every word and every language is a model. All maps and statistics, books and databases, equations and computer programs are models. So are the ways I picture the world in my head — my mental models. None of these is or ever will be the real world.
It’s endlessly engrossing to take in the world as a series of events, and constantly surprising, because that way of seeing the world has almost no predictive or explanatory value.
The behavior of a system is its performance over time — its growth, stagnation, decline, oscillation, randomness, or evolution.
System structure is the source of system behavior. System behavior reveals itself as a series of events over time.
So the world often surprises our linear-thinking minds. If we’ve learned that a small push produces a small response, we think that twice as big a push will produce twice as big a response. But in a nonlinear system, twice the push could produce one-sixth the response, or the response squared, or no response at all.
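As a concrete illustration (not from the book), the sketch below compares a linear response with one hypothetical nonlinear response; the saturating curve is an arbitrary assumption chosen only to show that doubling the push need not double the result.

```python
# Illustrative sketch only; the response functions below are assumptions, not from the book.

def linear_response(push):
    return 2.0 * push                    # response proportional to the push

def saturating_response(push):
    return 10.0 * push / (5.0 + push)    # hypothetical nonlinear curve that levels off near 10

for push in (1.0, 2.0, 4.0):
    print(f"push={push:4.1f}  linear={linear_response(push):5.2f}  "
          f"nonlinear={saturating_response(push):5.2f}")

# Doubling the push always doubles the linear response,
# but it moves the saturating response less and less.
```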
We have to invent boundaries for clarity and sanity; and boundaries can produce problems when we forget that we’ve artificially created them.
It’s a great art to remember that boundaries are of our own making, and that they can and should be reconsidered for each new discussion, problem, or purpose. It’s a challenge to stay creative enough to drop the boundaries that worked for the last problem and to find the most appropriate set of boundaries for the next question. It’s also a necessity, if problems are to be solved well.
At any given time, the input that is most important to a system is the one that is most limiting.
Insight comes not only from recognizing which factor is limiting, but from seeing that growth itself depletes or enhances limits and therefore changes what is limiting.
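A worked toy example (my illustration, not the book's): if output is capped by whichever input is scarcest, in the spirit of Liebig's law of the minimum, then adding more of a non-limiting input changes nothing, and relieving one limit simply hands the role of "most limiting" to the next input.

```python
# Illustrative sketch, not from the book: output limited by the scarcest input.

def potential_output(nitrogen, phosphorus, water):
    # Hypothetical inputs; output is capped by whichever is most limiting.
    return min(nitrogen, phosphorus, water)

print(potential_output(10, 3, 8))   # 3 -> phosphorus is the limiting input
print(potential_output(50, 3, 8))   # still 3 -> extra nitrogen changes nothing
print(potential_output(10, 6, 8))   # 6 -> relieving the limit helps, until the next limit binds
```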
Any physical entity with multiple inputs and outputs is surrounded by layers of limits.
For any physical entity in a finite environment, perpetual growth is impossible. Ultimately, the choice is not to grow forever but to decide what limits to live within.
There always will be limits to growth. They can be self-imposed. If they aren’t, they will be system-imposed.
Bounded rationality means that people make quite reasonable decisions based on the information they have. ... The bounded rationality of each actor in a system — determined by the information, incentives, disincentives, goals, stresses, and constraints impinging on that actor — may or may not lead to decisions that further the welfare of the system as a whole. If they do not, putting new actors into the same system will not improve the system’s performance. What makes a difference is redesigning the system to improve the information, incentives, disincentives, goals, stresses, and constraints that have an effect on specific actors.
You won’t get your way with the system, but it won’t go as far in a bad direction as you think, because much of the action you were trying to correct was in response to your own action. If you calm down, those who are pulling against you will calm down too.

Five, System Traps…and Opportunities

The most effective way of dealing with policy resistance is to find a way of aligning the various goals of the subsystems, usually by providing an overarching goal that allows all actors to break out of their bounded rationality.
The tragedy of the commons arises from missing (or too long delayed) feedback from the resource to the growth of the users of that resource.
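A minimal sketch of that mechanism (an illustration with arbitrary regrowth and grazing numbers, not a model from the book): when the herders get no signal from the pasture, each keeps adding animals and the commons collapses; a crude feedback from the resource back to herd size keeps it alive.

```python
# Illustrative commons sketch; all parameters are arbitrary assumptions.

def simulate_commons(feedback, seasons=40):
    grass, cows = 100.0, 10
    for _ in range(seasons):
        grass += 0.2 * grass * (1 - grass / 100.0)   # pasture regrows toward a capacity of 100
        grass = max(grass - 1.0 * cows, 0.0)         # each cow eats one unit per season
        if feedback and grass < 80.0:
            cows = max(cows - 1, 0)   # feedback from the resource restrains the herd
        else:
            cows += 1                 # missing feedback: private gain says "add another cow"
    return grass, cows

print("missing feedback: grass=%.1f cows=%d" % simulate_commons(feedback=False))
print("with feedback:    grass=%.1f cows=%d" % simulate_commons(feedback=True))
```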
...there is a distinction between the actual system state and the perceived state. The actor tends to believe bad news more than good news. As actual performance varies, the best results are dismissed as aberrations, the worst results stay in the memory. The actor thinks things are worse than they really are. And to complete this tragic archetype, the desired state of the system is influenced by the perceived state. Standards aren’t absolute.
Drift to low performance is a gradual process. If the system state plunged quickly, there would be an agitated corrective process. But if it drifts down slowly enough to erase the memory of (or belief in) how much better things used to be, everyone is lulled into lower and lower expectations, lower effort, lower performance.
There are two antidotes to eroding goals. One is to keep standards absolute, regardless of performance. Another is to make goals sensitive to the best performances of the past, instead of the worst.
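A rough sketch of this archetype (my illustration; the weights and adjustment rates are assumptions, not the book's): performance tracks the goal with noise, bad news is believed more than good, and the goal drifts toward the pessimistic perception unless it is anchored to the best performance of the past.

```python
# Illustrative eroding-goals sketch; all weights and rates are assumptions, not from the book.
import random

def simulate(anchor_to_best, steps=50):
    random.seed(1)                 # same noise stream for both runs, to compare fairly
    goal = perceived = best = 100.0
    for _ in range(steps):
        actual = goal + random.uniform(-10, 10)      # performance tracks the current goal, plus noise
        weight = 0.6 if actual < perceived else 0.2  # bad news is believed more than good news
        perceived += weight * (actual - perceived)   # so the perceived state runs pessimistic
        best = max(best, actual)
        reference = best if anchor_to_best else perceived
        goal += 0.1 * (reference - goal)             # the standard drifts toward its reference
    return goal

print("goal tied to perceived state: ", round(simulate(anchor_to_best=False), 1))
print("goal tied to best performance:", round(simulate(anchor_to_best=True), 1))
```

In the first run the goal ratchets downward; in the second it holds or rises, which is the second antidote in action.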
One way out of the escalation trap is unilateral disarmament — deliberately reducing your own system state to induce reductions in your competitor’s state. ... The only other graceful way out of the escalation system is to negotiate a disarmament. That’s a structural change, an exercise in system design.
This system trap [success to the successful] is found whenever the winners of a competition receive, as part of the reward, the means to compete even more effectively in the future.
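A toy illustration (mine, not the book's): two competitors start nearly even, and whoever is currently ahead captures each round's reward, so a 51/49 split turns into a rout.

```python
# Illustrative success-to-the-successful sketch; the numbers are arbitrary assumptions.

a, b = 51.0, 49.0                 # nearly equal starting resources
for _ in range(20):
    if a >= b:                    # the current winner's reward is more means to compete
        a += 10.0
    else:
        b += 10.0

print(f"a = {a:.0f} ({a / (a + b):.0%}),  b = {b:.0f} ({b / (a + b):.0%})")
```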
Addiction is finding a quick and dirty solution to the symptom of the problem, which prevents or distracts one from the harder and longer-term task of solving the real problem. Addictive policies are insidious, because they are so easy to sell, so simple to fall for.
The problem can be avoided up front by intervening in such a way as to strengthen the ability of the system to shoulder its own burdens. This option, helping the system to help itself, can be much cheaper and easier than taking over and running the system — something liberal politicians don’t seem to understand. The secret is to begin not with a heroic takeover, but with a series of questions.
If you are the intervenor, work in such a way as to restore or enhance the system’s own ability to solve its problems, then remove yourself. If you are the one with an unsupportable dependency, build your system’s own capabilities back up before removing the intervention. Do it right away. The longer you wait, the harder the withdrawal process will be.
The “letter of the law” is met, the spirit of the law is not [rule beating].
If you define the goal of a society as GNP, that society will do its best to produce GNP. It will not produce welfare, equity, justice, or efficiency unless you define a goal and regularly measure and report the state of welfare, equity, justice, or efficiency. The world would be a different place if instead of competing to have the highest per capita GNP, nations competed to have the highest per capita stocks of wealth with the lowest throughput, or the lowest infant mortality, or the greatest political freedom, or the cleanest environment, or the smallest gap between the rich and the poor.

Six, Leverage Points: Places to Intervene in a System

Leverage points frequently are not intuitive. Or if they are, we too often use them backward, systematically worsening whatever problems we are trying to solve.
Putting different hands on the faucets may change the rate at which the faucets turn, but if they’re the same old faucets, plumbed into the same old system, turned according to the same old information and goals and rules, the system behavior isn’t going to change much.
It’s not that parameters aren’t important—they can be, especially in the short term and to the individual who’s standing directly in the flow. People care deeply about such variables as taxes and the minimum wage, and so fight fierce battles over them. But changing these variables rarely changes the behavior of the national economy system. If the system is chronically stagnant, parameter changes rarely kick-start it. If it’s wildly variable, they usually don’t stabilize it. If it’s growing out of control, they don’t slow it down.
Parameters become leverage points when they go into ranges that kick off one of the items higher on this list. ... System goals are parameters that can make big differences.
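A bathtub-style sketch makes the contrast concrete (an illustration with assumed numbers, not a model from the book): with a constant inflow and an outflow that drains a fixed fraction of the stock, changing the inflow parameter only moves the level at which the stock settles, while rewiring the inflow so that it depends on the stock itself, a reinforcing loop, produces a different kind of behavior altogether.

```python
# Illustrative stock-and-flow sketch; all rates are assumptions, not from the book.

def constant_inflow(inflow, stock=10.0, steps=60):
    # Parameter change only: the stock smoothly approaches inflow / 0.1,
    # whatever value the inflow parameter takes.
    for _ in range(steps):
        stock += inflow - 0.1 * stock     # outflow drains 10% of the stock each step
    return stock

def reinforcing_inflow(stock=10.0, steps=60):
    # Structural change: the inflow now depends on the stock itself,
    # a reinforcing loop, and the behavior mode becomes exponential growth.
    for _ in range(steps):
        stock += 0.2 * stock - 0.1 * stock
    return stock

print("inflow = 5: settles near", round(constant_inflow(5.0), 1))                   # ~50
print("inflow = 10: same pattern, settles near", round(constant_inflow(10.0), 1))   # ~100
print("inflow tied to the stock: grows to", round(reinforcing_inflow(), 1))
```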
Missing information flows is one of the most common causes of system malfunction. Adding or restoring information can be a powerful intervention, usually much easier and cheaper than rebuilding physical infrastructure.
Self-organization means changing any aspect of a system lower on this list—adding completely new physical structures, such as brains or wings or computers—adding new balancing or reinforcing loops, or new rules. The ability to self-organize is the strongest form of system resilience. A system that can evolve can survive almost any change, by changing itself. The human immune system has the power to develop new responses to some kinds of insults it has never before encountered. The human brain can take in new information and pop out completely new thoughts.
Insistence on a single culture shuts down learning and cuts back resilience. Any system, biological, economic, or social, that gets so encrusted that it cannot self-evolve, a system that systematically scorns experimentation and wipes out the raw material of innovation, is doomed over the long term on this highly variable planet.
The intervention point here is obvious, but unpopular. Encouraging variability and experimentation and diversity means “losing control.”
That is to keep oneself unattached in the arena of paradigms, to stay flexible, to realize that no paradigm is “true,” that every one, including the one that sweetly shapes your own worldview, is a tremendously limited understanding of an immense and amazing universe that is far beyond human comprehension. It is to “get” at a gut level the paradigm that there are paradigms, and to see that that itself is a paradigm, and to regard that whole realization as devastatingly funny. It is to let go into not-knowing, into what the Buddhists call enlightenment.
... there is no power, no control, no understanding, not even a reason for being, much less acting, embodied in the notion that there is no certainty in any worldview.
The higher the leverage point, the more the system will resist changing it.
Magical leverage points are not easily accessible, even if we know where they are and which direction to push on them. There are no cheap tickets to mastery. You have to work hard at it, whether that means rigorously analyzing a system or rigorously casting off your own paradigms and throwing yourself into the humility of not-knowing. In the end, it seems that mastery has less to do with pushing leverage points than it does with strategically, profoundly, madly, letting go and dancing with the system.
~
  1. Transcending Paradigms
  2. Paradigms—The mind-set out of which the system—its goals, structure, rules, delays, parameters—arises
  3. Goals—The purpose or function of the system
  4. Self-organization—The power to add, change, or evolve system structure
  5. Rules—Incentives, punishments, constraints
  6. Information Flows—The structure of who does and does not have access to information
  7. Reinforcing Feedback Loops—The strength of the gain of driving loops
  8. Balancing Feedback Loops—The strength of the feedbacks relative to the impacts they are trying to correct
  9. Delays—The lengths of time relative to the rates of system changes
  10. Stock-and-Flow Structures—Physical systems and their nodes of intersection
  11. Buffers—The sizes of stabilizing stocks relative to their flows
  12. Numbers—Constants and parameters such as subsidies, taxes, standards
~

Seven, Living in a World of Systems

Self-organizing, nonlinear, feedback systems are inherently unpredictable. They are not controllable. They are understandable only in the most general way. The goal of foreseeing the future exactly and preparing for it perfectly is unrealizable. The idea of making a complex system do just what you want it to do can be achieved only temporarily, at best. We can never fully understand our world, not in the way our reductionist science has led us to expect.
Our science itself, from quantum theory to the mathematics of chaos, leads us into irreducible uncertainty. For any objective other than the most trivial, we can’t optimize; we don’t even know what to optimize. We can’t keep track of everything. We can’t find a proper, sustainable relationship to nature, each other, or the institutions we create, if we try to do it from the role of omniscient conqueror.
We don’t have to change anyone’s values, we just have to get the system to operate around real values. ... What causes a person or a society to give up on attaining “real values” and to settle for cheap substitutes? ... How do systems create cultures? How do cultures create systems? Once a culture and system have been found lacking, do they have to change through breakdown and chaos? ... Why are people so easily convinced of their powerlessness?
The future can’t be predicted, but it can be envisioned and brought lovingly into being. Systems can’t be controlled, but they can be designed and redesigned. We can’t surge forward with certainty into a world of no surprises, but we can expect surprises and learn from them and even profit from them. We can’t impose our will on a system. We can listen to what the system tells us, and discover how its properties and our values can work together to bring forth something much better than could ever be produced by our will alone.
Living successfully in a world of systems requires more of us than our ability to calculate. It requires our full humanity—our rationality, our ability to sort out truth from falsehood, our intuition, our compassion, our vision, and our morality.
Before you disturb the system in any way, watch how it behaves. If it’s a piece of music or a whitewater rapid or a fluctuation in a commodity price, study its beat. If it’s a social system, watch it work. Learn its history. Ask people who’ve been around a long time to tell you what has happened. If possible, find or make a time graph of actual data from the system—people’s memories are not always reliable when it comes to timing.
Starting with the behavior of the system forces you to focus on facts, not theories. It keeps you from falling too quickly into your own beliefs or misconceptions, or those of others. ... Starting with the history of several variables plotted together begins to suggest not only what elements are in the system, but how they might be interconnected. ... Starting with history discourages the common and distracting tendency we all have to define a problem not by the system’s actual behavior, but by the lack of our favorite solution.
Mental flexibility—the willingness to redraw boundaries, to notice that a system has shifted into a new mode, to see how to redesign structure—is a necessity when you live in a world of flexible systems.
Honoring information means above all avoiding language pollution—making the cleanest possible use we can of language. Second, it means expanding our language so we can talk about complexity.
"In fact, we don’t talk about what we see; we see only what we can talk about."
Fred Kofman
The language and information systems of an organization are not an objective means of describing an outside reality—they fundamentally structure the perceptions and actions of its members.
"My impression is that we have seen, for perhaps a hundred and fifty years, a gradual increase in language that is either meaningless or destructive of meaning. And I believe that this increasing unreliability of language parallels the increasing disintegration, over the same period, of persons and communities.…" Fred Kofman
The first step in respecting language is keeping it as concrete, meaningful, and truthful as possible—part of the job of keeping information streams clear. The second step is to enlarge language to make it consistent with our enlarged understanding of systems.
Our culture, obsessed with numbers, has given us the idea that what we can measure is more important than what we can’t measure. Think about that for a minute. It means that we make quantity more important than quality. If quantity forms the goals of our feedback loops, if quantity is the center of our attention and language and institutions, if we motivate ourselves, rate ourselves, and reward ourselves on our ability to produce quantity, then quantity will be the result.
If something is ugly, say so. If it is tacky, inappropriate, out of proportion, unsustainable, morally degrading, ecologically impoverishing, or humanly demeaning, don’t let it pass. Don’t be stopped by the “if you can’t define it and measure it, I don’t have to pay attention to it” ploy. No one can define or measure justice, democracy, security, freedom, truth, or love. No one can define or measure any value. But if no one speaks up for them, if systems aren’t designed to produce them, if we don’t speak about them and point toward their presence or absence, they will cease to exist.
Especially where there are great uncertainties, the best policies not only contain feedback loops, but meta-feedback loops—loops that alter, correct, and expand loops. These are policies that design learning into the management process.
Don’t maximize parts of systems or subsystems while ignoring the whole. Don’t, as Kenneth Boulding once said, go to great trouble to optimize something that never should be done at all. Aim to enhance total systems properties, such as growth, stability, diversity, resilience, and sustainability—whether they are easily measured or not.
Aid and encourage the forces and structures that help the system run itself. Notice how many of those forces and structures are at the bottom of the hierarchy. Don’t be an unthinking intervenor and destroy the system’s own self-maintenance capacities. Before you charge in to make things better, pay attention to the value of what’s already there.
That’s a guideline [responsibility] both for analysis and design. In analysis, it means looking for the ways the system creates its own behavior. Do pay attention to the triggering events, the outside influences that bring forth one kind of behavior from the system rather than another.
“Intrinsic responsibility” means that the system is designed to send feedback about the consequences of decision making directly and quickly and compellingly to the decision makers.
... enough to get you thinking about how little our current culture has come to look for responsibility within the system that generates an action, and how poorly we design systems to experience the consequences of their actions.
“Stay the course” is only a good idea if you’re sure you’re on course. Pretending you’re in control even when you aren’t is a recipe not only for mistakes, but for not learning from mistakes. What’s appropriate when you’re learning is small steps, constant monitoring, and a willingness to change course as you find out more about where it’s leading.
[Celebrate Complexity] Only a part of us, a part that has emerged recently, designs buildings as boxes with uncompromising straight lines and flat surfaces. Another part of us recognizes instinctively that nature designs in fractals, with intriguing detail on every scale from the microscopic to the macroscopic. That part of us makes Gothic cathedrals and Persian carpets, symphonies and novels, Mardi Gras costumes and artificial intelligence programs, all with embellishments almost as complex as the ones we find in the world around us.
"There is a great deal of historical evidence to suggest that a society which loses its identity with posterity and which loses its positive image of the future loses also its capacity to deal with present problems, and soon falls apart.…"  Kenneth Boulding
When you’re walking along a tricky, curving, unknown, surprising, obstacle-strewn path, you’d be a fool to keep your head down and look just at the next step in front of you. You’d be equally a fool just to peer far ahead and never notice what’s immediately under your feet. You need to be watching both the short and the long term—the whole system.
Interdisciplinary communication works only if there is a real problem to be solved, and if the representatives from the various disciplines are more committed to solving the problem than to being academically correct.
We know what to do about drift to low performance. Don’t weigh the bad news more heavily than the good. And keep standards absolute.