
Book Review: Thinking in Systems: A Primer

When I reread The Fifth Discipline and wrote a book review for it, I found a few references to Donella Meadows, and upon further research I found the book Thinking in Systems. The book is unusual in that it was published posthumously: the draft was written in 1993 but was never published. Meadows died unexpectedly in 2001, and in 2008 the book was finally brought to print. I’m glad it was. As much as Senge’s work made systems thinking popular, Meadows’ work was much richer and deeper than was possible within The Fifth Discipline.

It’s possible to believe at first glance that learning from The Fifth Discipline might be enough, but systems thinking was only one of its five disciplines; it didn’t get the depth it deserves as a way of thinking that can change your effectiveness.

Essence of a System

Years ago, when speaking about database terms, we talked about atomic transactions: a transaction should either completely succeed or completely fail. One of the problems with early file-based ISAM databases was that a record might be updated in one file and not in another. Modern SQL-based databases solve this problem by wrapping changes to a set of tables into a single transaction, one atomic operation that will either succeed or fail. It will never end up partially updating tables. You can’t extract just part of the complex update; you have to take it all, or none of it.
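
As a quick illustration (my own sketch, not from the book), Python’s built-in sqlite3 module shows the all-or-nothing behavior directly; the accounts table and the simulated failure are invented for the example:

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE accounts (name TEXT PRIMARY KEY, balance INTEGER)")
    conn.execute("INSERT INTO accounts VALUES ('alice', 100), ('bob', 0)")
    conn.commit()

    try:
        with conn:  # opens a transaction; commits on success, rolls back on error
            conn.execute("UPDATE accounts SET balance = balance - 50 WHERE name = 'alice'")
            raise RuntimeError("simulated crash mid-transfer")
            # the matching credit to 'bob' is never reached
    except RuntimeError:
        pass

    # Neither half of the transfer stuck; the transaction rolled back as a unit.
    print(dict(conn.execute("SELECT name, balance FROM accounts")))  # {'alice': 100, 'bob': 0}

Had the error not occurred, leaving the with block would have committed both updates together.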

Systems are the same way. They have an all-or-nothing quality: you can’t manipulate one part of a system without having an impact on something else. We saw this in Diffusion of Innovations and The Fifth Discipline as the law of unintended consequences. You have to take the whole of the system; you can’t take just the pieces. However, in order to approach an understanding of the system, in order to create a mental model of it (see Sources of Power and The Fifth Discipline), you’ll have to try to decompose the system into understandable components. Those components are the elements of the system, the way they’re connected, and the purpose of the system. In systems thinking, the system is more than the sum of its parts.

This is part of why wicked problems (See Dialogue Mapping and The Heretic’s Guide to Best Practices) are wicked. You can’t work on just one part of the problem, you have to work on the entire system at once – and all of the relationships between all of the parts.


Elements, Connections, and Purpose

Taking apart a system should be easy. Nearly everything we have ever run across can be dissected into smaller and smaller units. A person can be broken down into organs, cells, molecules, atoms, protons, quarks, etc. We’ve become adept at identifying the component parts of a system, so it’s not surprising that seeing the individual elements is the easiest part of understanding one. In most cases elements are tangible. Even when they’re intangible, things like pride and identity, they’re still relatively easy for our concrete-focused minds to understand. (See Pervasive Information Architecture for our need to learn through concrete things.)

It’s often harder to see the interconnections between the elements in a system. If you’re looking at an airplane you can see the engines and the thrust, and you can observe the plane in flight, but it’s not obvious how the shape of the wings (camber) relates to the amount of lift generated. The thrust that the engines produce is being converted into lift by the wings, but in a non-obvious way. The only hope for seeing connections between different parts of the system is to observe the system in action.
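
To make the wing example concrete (my aside, not the book’s), the standard lift equation relates lift to the square of airspeed and a lift coefficient that encodes the wing’s shape, including its camber. The numbers below are illustrative guesses, not real aircraft data:

    # Lift equation: L = 0.5 * rho * v^2 * S * C_L
    def lift(rho: float, v: float, S: float, C_L: float) -> float:
        """Lift force in newtons.

        rho: air density (kg/m^3), v: airspeed (m/s),
        S: wing area (m^2), C_L: lift coefficient (shape/camber dependent).
        """
        return 0.5 * rho * v ** 2 * S * C_L

    # A small plane at sea level (rho ~ 1.225), 60 m/s, 16 m^2 of wing:
    print(lift(1.225, 60.0, 16.0, 0.4))  # ~14,112 N

The connection is invisible in the parts list; it only appears in the relationship between the variables.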

More difficult to see is the purpose of the system. In the case of an airplane, the fact that it’s used for transportation is not obvious at first glance when you look only at the collection of components. Of course we all know that airplanes are designed for transportation, but that’s because we’ve seen them in action.

Consider the raging debates about what Stonehenge was created for. There are numerous theories about what the rocks were used for. We know that they’re rocks (elements) and we can see the connections (orientation), but because we’ve not been able to see their original use, we still don’t know for certain what their purpose was. The best (if not only) way to deduce the purpose of a system is to see it in operation.

Of these three components of a system, the elements are the easiest to see. However, elements have the least impact; the connections and the purpose are much more important. The purpose is also the most difficult to observe. In fact, Paul Ekman, who developed a mechanism for determining a person’s emotional state through micro-expressions, says that you can accurately observe the emotion but you cannot know why the person felt it. (See Social Engineering, Trust Me, and Emotional Awareness for more about Ekman’s work.)

Some systems have espoused purposes that don’t match their actual purposes. Consider, for instance, a buy-here-pay-here car lot. The espoused purpose is to sell cars. However, the actual purpose is to provide high-risk loans, loans with higher interest rates designed to be profitable for the dealer. Many of these car lots’ marketing schemes are targeted at providing freedom to people who wouldn’t otherwise be able to buy a car. There can be one espoused purpose and an entirely different actual purpose.

Seeing purpose in the behavior of a system is somewhat like trying to see the wind blow. You can’t directly see which way the wind is blowing; for a pilot, however, knowing the wind direction and speed can be important. That’s why pilots are taught to get wind reports, but also to look at smoke from smokestacks and the way trees bend to approximate the direction of the wind through indirect means.

One of the challenges that happens when trying to infer the behavior of a system is the belief that what you see is all there is (WYSIATI). (See Thinking, Fast and Slow.) However, in the systems world, what you see is like what you see of an iceberg: only about a tenth of its volume sits above the water. There is so much that isn’t directly visible without careful observation. For instance, you can see a duck on the surface of the water, but you can’t see how much, or how little, the duck is paddling underneath. In this way, even observing a river doesn’t reveal all of the factors influencing it.
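
The iceberg figure falls straight out of Archimedes’ principle; this small calculation (mine, not the book’s) shows where it comes from:

    # A floating body displaces its own weight in water, so the submerged
    # fraction equals the ratio of the densities.
    ICE = 917.0        # glacial ice, kg/m^3
    SEAWATER = 1025.0  # seawater, kg/m^3

    submerged = ICE / SEAWATER
    print(f"visible above the waterline: {1.0 - submerged:.1%}")  # ~10.5%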

Stocks and Flows

Systems consist of stocks and flows. Stocks are the batteries or reserves; they are the storehouse of the system. Flows are the inputs and outputs to that stock. For instance, the Dead Sea has the Jordan River as an inflow, rain as another input, and evaporation as its only output; there is no other outlet. The stock is the Dead Sea itself, the water contained in it.
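
A stock is simply the running accumulation of its flows. Here’s a minimal sketch (my own toy numbers, nothing from the book) of a stock with a fixed inflow and an outflow that, like evaporation, scales with the stock itself:

    stock = 100.0        # initial volume (arbitrary units)
    inflow = 12.0        # river plus rain, per time step
    outflow_rate = 0.10  # fraction of the stock lost to evaporation per step

    for step in range(10):
        outflow = outflow_rate * stock
        stock += inflow - outflow
        print(f"step {step}: stock = {stock:.1f}")

    # The stock settles where inflow equals outflow: inflow / outflow_rate = 120.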

As humans, we’re more aware of stocks than we are of flows. That’s one of the reasons we’re more concerned about our bank account balances than the amount we pay for cable each month. Cable is an outward flow; our bank balance is our stock of money. One of the reasons people who make a lot of money also spend a lot of money is this focus on stocks: if we have a stock of money, then our outward flows must be acceptable.

Delays

A child’s ability to delay gratification is a better measure of future success than any IQ test. The simple test of whether a child could wait a few minutes to get two marshmallows instead of one predicted success later in life better than sophisticated instruments designed to measure a child’s intelligence. (See How Children Succeed, Introducing Psychology of Success, and The Information Diet for Walter Mischel’s famous test.) It seems our ability to see, and accept, delays is an important part of our success in this world. That is likely because our world is alive with systems all interacting with one another, all of them with inherent delays.

Every system has delays in it. When something changes, the system doesn’t respond instantly; you take an action and there’s a delay before the response. The classic example of this is the temperature of a shower, as I mentioned in my review of The Fifth Discipline. However, delays exist in all of our everyday systems. Consider your paycheck: when you work, you don’t instantly get paid. In most cases it takes two or three weeks before you receive your paycheck for the time you’ve worked.
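
The shower makes a nice toy simulation (my own sketch, not the book’s): you adjust the valve based on the temperature you feel now, but the water you feel left the valve several steps ago, so you overshoot and oscillate:

    TARGET = 38.0  # desired temperature, deg C
    GAIN = 0.5     # how aggressively we turn the tap
    DELAY = 3      # steps between a valve change and feeling its effect

    pipe = [20.0] * (DELAY + 1)  # water already on its way to the shower head
    valve = 20.0

    for step in range(20):
        felt = pipe[0]                   # what we feel now is old water
        valve += GAIN * (TARGET - felt)  # we correct based on stale information
        pipe = pipe[1:] + [valve]
        print(f"step {step}: felt = {felt:.1f}")

Set DELAY to 0 and the same gain converges smoothly; shrinking the delay is exactly the leverage the rodeo story below illustrates.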

The more steps in a system that something must go through, the longer the delay. I had the pleasure recently of meeting a rodeo champion. He’s a trainer of students trying to become better at rodeo, but also of horses. One of the things he explained was that the horses are trained to follow the bulls. The reason is simple: if the horse follows the bull automatically, the rider can focus on the rope and not have to worry about guiding the horse. This eliminates the inherent delay between the rider recognizing the path of the bull and signaling the horse to go in that direction. (It also speaks to avoiding information overload, as in The Information Diet.) Pervasive Information Architecture spoke of how giving actors real wine instead of colored water allowed them to focus on their performance rather than the distraction of pretending colored water was wine.

However, more than allowing the rider to focus on the rope, a delay (between recognizing the motion of the bull and correcting the horse’s motion) has been removed. The reduced delay minimizes the oscillations and improves the performance of the system.

Optimization and Resilience

Any competent financial analyst will tell you to spread your investments across a variety of industries and types of investment. This is the process of adding resilience to your portfolio. You can make one big bet on a particular market, industry, or even company, but the risks of that approach are large.
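
The arithmetic behind that advice is worth seeing once. This quick simulation (my own illustration, with made-up return figures) compares one big bet to the same money spread across ten uncorrelated assets:

    import random

    random.seed(42)

    def yearly_return() -> float:
        # One risky asset: 8% expected return, 20% volatility (assumed).
        return random.gauss(0.08, 0.20)

    TRIALS = 10_000
    one_bet = [yearly_return() for _ in range(TRIALS)]
    spread = [sum(yearly_return() for _ in range(10)) / 10 for _ in range(TRIALS)]

    def stdev(xs: list) -> float:
        m = sum(xs) / len(xs)
        return (sum((x - m) ** 2 for x in xs) / len(xs)) ** 0.5

    print(f"one bet:    stdev {stdev(one_bet):.3f}")  # ~0.20
    print(f"ten assets: stdev {stdev(spread):.3f}")   # ~0.06, same expected return

The expected return is the same; the volatility shrinks by roughly the square root of ten. That reduction in variance is the resilience.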

However, this is what we often do in optimization. We make one big bet on how a technology will work. We eliminate the parts of the system we don’t think are necessary because they take energy to keep operational and don’t return value. However, when problems arise, those critical components that were cut can’t kick in to keep the system from collapsing. What is most frequently cut when optimizing is anything not directly related to output, but sometimes those secondary loops and support structures are exactly what the system needs once things start to go wrong.

During the United States housing bubble, financial instruments that were intended to be stable based on the way they were constructed were slowly rearranged, and in the process the securities they offered were no longer resilient against market downturns. As a result, securities firms like Bear Stearns failed at the end of the housing bubble. An optimization toward returns had removed the resiliency from the system.

The resilience in the system of financial markets takes the form of fail-safes against individuals taking actions that are individually beneficial but harmful to the overall market, like offering home loans to people who can’t afford them. It looked great to the people getting loans on houses, but it left the balancing loops in the system insufficient to catch the fall. Part of that is based on the bounded rationality of the individual players in the system.

Bounded Rationality

Economists – despite popular belief – don’t study money. They study people’s behavior around money. It may seem like a subtle distinction but it’s not. Why are people willing to pay $2 for a bottle of water at the airport? Because the TSA made you throw out the bottle you had when you went to the gate – and because you’re thirsty and that’s the only option. Or is it? There’s a water fountain that’s a few hundred feet away.

The logical, economic choice is to use the water fountain. However, people pay for the convenience of having a bottle of water. Some folks actually pay for the extra purification, but mostly it’s convenience. From an economic point of view this doesn’t make sense. Why would you hand over your hard-earned money for something you could get for free? But we do it all the time.

Bounded rationality is about the rational decisions we make based on the information at hand. We believe that what we see is all there is. Bounded rationality allows for some seemingly nonsensical results. Consider the tragedy of the commons. We know that adding another cow to our herd improves our profitability. The cows graze on common land. Surely one more cow won’t be a problem – until all of the other farmers make the same individually rational decision. If this happens, the commons will be overgrazed and there won’t be enough food for anyone’s cows.
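
A toy model (my own, with invented numbers) makes the trap visible: each farmer’s extra cow raises his own income, yet once the herd passes the pasture’s capacity, everyone earns less:

    FARMERS = 10
    CAPACITY = 50  # cows the commons can feed without degrading

    def yield_per_cow(total_cows: int) -> float:
        # Yield degrades linearly once the pasture is over capacity (assumed form).
        return max(0.0, 1.0 - max(0, total_cows - CAPACITY) / CAPACITY)

    for cows_each in (4, 5, 6, 8, 10):
        total = FARMERS * cows_each
        income = cows_each * yield_per_cow(total)
        print(f"{cows_each} cows per farmer: income per farmer {income:.2f}")

At 50 cows total, a single farmer adding a sixth cow still comes out ahead (6 × 0.98 = 5.88 versus 5.00), which is precisely why the commons collapses.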

The solution to minimizing the impact of bounded rationality is to create greater awareness of the overall system and how it works. Understanding more about how a system works gives you the opportunity to leverage the system to make changes that are in everyone’s best interests.

12 Leverage Points

Thinking in Systems ends with 12 leverage points, listed here from least to most powerful:

  • Numbers—Constants and parameters such as subsidies, taxes, and standards. These are the most commonly attempted ways to manipulate systems, and consequently there is generally very little change available here. One exception is at a discontinuity, where a small numeric change causes the system to operate differently.
  • Buffers—The sizes of stabilizing stocks relative to their flows. Reducing buffers increases efficiency (optimization) by reducing resilience. Vendor-managed inventory reduced the amount of inventory held at distributors, but it leads to more outages.
  • Stock-and-Flow Structures—Physical systems and their nodes of intersection. Basically, this is rebuilding the system, which can be very effective; but rebuilding a system is also difficult to accomplish, so it’s not very high on the list of leverage points.
  • Delays—The lengths of time relative to the rates of system changes. If you shrink the delay you reduce the oscillations and improve responsiveness to changing conditions.
  • Balancing Feedback Loops—The strength of the feedbacks relative to the impacts they are trying to correct can help keep stocks in safe bounds. (See the sketch after this list.)
  • Reinforcing Feedback Loops—The strength of the gain of driving loops that cause the system to want to go out of control, either positively or negatively. (Also in the sketch after this list.)
  • Information Flows—The structure of who does and does not have access to information can reduce malfunctions of human and non-human systems.
  • Rules—Incentives, punishments, constraints have a great deal of power over systems. Subtly changing a rule can dramatically change how a system operates.
  • Self-Organization—The ability to add, change, or evolve system structure is a powerful way of introducing resilience into the system.
  • Goals—The purpose or function of the system – and the framing of the goal – are powerful motivators to systems.
  • Paradigms—The mind-set out of which the system—its goals, structure, rules, delays, parameters—arises. This runs even deeper than goals and is the source of systems.
  • Transcending Paradigms—By transcending paradigms you can see individual systems for their limitations and put different systems together to get richer results.
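
For the two feedback-loop entries above, a few lines of toy dynamics (my own illustration) show the difference in character: a reinforcing loop compounds on itself, while a balancing loop closes the gap to a goal:

    reinforcing = 1.0
    balancing = 0.0
    GOAL = 100.0

    for step in range(8):
        reinforcing *= 1.5                     # gain > 1: growth feeds on itself
        balancing += 0.5 * (GOAL - balancing)  # each step closes half the gap
        print(f"step {step}: reinforcing {reinforcing:7.1f}, balancing {balancing:5.1f}")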

Back to the Beginning

It’s incredibly unfortunate that Donella Meadows passed away before completing Thinking in Systems – but very fortunate that her work was eventually published. The more that we understand about systems the more we can understand about the organizations we work in, the communities that we live in, and the world as a whole.
