Systems Thinking Chapter 12: Redefining Success - Part 1
We are near the end of the book. Chapter 12 is called “Redefining Success” and it asks two questions that sound simple but are not. How do you know you are learning systems thinking? And what does success even mean when you look at the whole system?
This is Part 1 of 2 for this chapter. There is a lot here, so I split it up.
Success Is Not a Number
Diana opens with a James Clear quote. "You do not rise to the level of your goals. You fall to the level of your systems." This is the setup for the whole chapter. Success is not one metric. It is a system of criteria. And if you measure only one thing, you are being reductionist. Which is exactly what systems thinking warns against.
Her example is McDonald’s. 13 billion dollars gross profit per year. But it also damages human bodies, burdens healthcare systems, generates three tons of packaging waste every minute, and drives deforestation for cattle grazing. Is that success? Or is that one metric looking good while everything else burns?
She compares it to tobacco. Massively profitable until society looked at the full systemic impact. Will we look at fast food the same way someday?
Diana is not anti-profit. She likes gluten-free cake and does not pretend to live on organic air. But success means keeping measurements and impact interrelated. Seeing the whole picture, not just the bottom line.
Enabling Constraints
This is a concept I really liked. Successful systems have enabling constraints. Limits on growth that let the system scale while containing the impact of that scaling. Constraints slow things down so you can observe what is happening and adapt.
Think about it in software terms. You could ship features as fast as possible. But without constraints like code reviews, testing, and observability, your system grows out of control, and that growth serves nobody. Enabling constraints are the guardrails that make sustainable growth possible.
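To make "enabling constraint" concrete, here is a minimal sketch in Python: a work-in-progress limit that refuses new work until something already started is finished. The `WipLimit` class and the limit of 2 are my own hypothetical illustration, not from the book.

```python
class WipLimit:
    """Hypothetical enabling constraint: cap work in progress.

    The limit does not stop growth; it forces work to be finished
    (and observed) before more work begins.
    """

    def __init__(self, limit):
        self.limit = limit
        self.in_progress = set()

    def start(self, task):
        if len(self.in_progress) >= self.limit:
            raise RuntimeError(
                f"WIP limit {self.limit} reached; finish something first"
            )
        self.in_progress.add(task)

    def finish(self, task):
        self.in_progress.discard(task)


board = WipLimit(limit=2)
board.start("feature-a")
board.start("feature-b")
try:
    board.start("feature-c")  # blocked: the constraint paces growth
except RuntimeError as e:
    print(e)
board.finish("feature-a")
board.start("feature-c")      # now there is capacity again
```

The constraint does not prevent growth; it paces it, which is exactly the observe-and-adapt point Diana is making.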
Diana puts it this way: success is not measured by how well we dominate a system, but by how well we thrive in it. That is a big mental shift for people raised in competitive, winner-take-all cultures.
Solving Root Causes, Not Symptoms
Here is a striking statistic. 1 in 100 US citizens is in prison. Highest rate in the world. Russia is second. Diana asks: is this success? Is it good law enforcement? Failed prevention? A profitable prison industry? All of the above?
The answer is “some combination of many factors.” But the deeper point is about intervention dependence. Instead of solving root causes, we keep applying fixes and band-aids. More control, more lids on problems. Never looking underneath.
She ties this back to the Iceberg Model from earlier chapters. Events are on top. Below that are patterns, structures, and mental models. If you only deal with events, you get intervention dependence. You keep reacting to the same problems, again and again.
In software, this is the team that keeps firefighting production issues but never fixes the architecture that causes them. Sound familiar? I have been on those teams. It is exhausting.
Equalizing Impact
Diana asks a sharp question. “Success for whom?” If success benefits one group but not others, that is not systemic success. Successful systems equalize impact.
She uses the prison numbers again. 30% of prisoners are white vs 64% of the population outside. 56% identify as Black or Hispanic vs 28% outside. 10% are women vs 50.4% outside. People below the poverty line are far more likely to end up in prison.
These imbalances come from what she calls the “Death by a Thousand Papercuts” problem. Small recurring issues that seem insignificant on their own but scale to massive systemic impact. This is why paying attention to small patterns matters. This is why changing the rules of the system matters, not just reacting to individual events.
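The compounding of papercuts is easy to demonstrate with a bit of arithmetic. The 1% cost per papercut and the count of 100 are my own hypothetical numbers, not from the book; the point is only that small effects repeated at scale are not small.

```python
# Hypothetical numbers: each "papercut" shaves just 1% off an outcome,
# and someone hits 100 of them. Individually negligible, collectively not.
papercut = 0.01
interactions = 100
remaining = (1 - papercut) ** interactions
print(f"after {interactions} papercuts: {remaining:.0%} of the outcome remains")
# ~37%: a 1% effect, repeated, is nowhere near a 1% effect
```

This is why the chapter insists on noticing small recurring patterns instead of dismissing each one as insignificant.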
Knowledge Flow Is Power
In systems, knowledge is power. Not the political kind of power, the systemic kind. The more knowledge flows in a system, the more likely that system succeeds. Diana calls this transparency in technology cultures.
And here is where hierarchy comes in. In healthy systems, hierarchy exists to coordinate the whole system toward its goals while giving subsystems enough autonomy to do their work. Information flows from bottom up, not top down. But as Diana discussed in Chapter 11 on leadership, many organizations do the exact opposite. Management hoards information, pushes commands down, and wonders why the system does not adapt.
Success Is a Paradigm Shift
Diana gets real about diversity in tech. 91.88% of software engineers worldwide are men, according to a 2022 study. Predominantly white, straight, American or European, under 40. In systems, lack of diversity is rarely natural. The system generates its own outcomes. We blame the wrong things. And when we try to fix the problem, we often make it worse, because systems behave counterintuitively.
She introduces a pitfall called “success to the successful.” People who succeed in a system benefit from that success in ways that generate more success for them. The problem, as Donella Meadows says, is the structure of the system, not the morals of the people in it. Fixing this means changing the structure, not lecturing individuals.
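The structural nature of "success to the successful" shows up even in a toy simulation. In this sketch (the numbers and the winner-take-all rule are my own illustration, not Meadows'), two players start nearly equal, and whoever is ahead captures each round's resources:

```python
# "Success to the successful" as structure, not morals:
# the current leader captures the whole reward each round.
a, b = 51.0, 49.0          # near-equal starting positions; a has a 2-point edge
for _ in range(20):
    if a >= b:
        a += 10.0          # being ahead attracts the resources...
    else:
        b += 10.0          # ...which keeps you ahead next round
print(f"a={a:.0f}, b={b:.0f}")  # the tiny initial edge decides everything
```

Neither player behaves differently; the allocation rule does all the work. That is why Meadows says to change the structure rather than lecture the individuals.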
The Qualities Checklist
Near the end of Part 1, Diana gives us something rare in this book: a checklist. A list of qualities that show you are developing systems thinking skills. I will summarize the ones that hit me hardest:
- You practice thinking as a daily activity. Writing, modeling, coding. And you call these essential technology skills, not “soft skills.”
- You know the difference between linear and systemic thinking. You know when to use each.
- You see people systems and technical systems as inseparable.
- When problems recur, you look for systemic structures and feedback loops instead of blaming leadership or buying a new tool.
- You accept uncertainty as natural. You shift mental models when circumstances change.
- You can describe what you think. When you do not know, you invest in deep work to figure it out.
- You recognize reactive and biased thinking patterns. In yourself and others.
- You create conceptual models and artifacts. You call artifact creation an essential skill.
- You encourage knowledge flow.
This is not a certification exam. It is more like a mirror. Hold it up and see how far you have come.
My Take
This part of Chapter 12 is where the book ties everything together. All the concepts from previous chapters (enabling constraints, the Iceberg Model, knowledge flow, feedback loops, mental models) show up here as components of what "success" really means.
The McDonald’s and prison examples show how single-metric thinking creates the illusion of success while the system rots. “We shipped 200 features this quarter!” Great. How many created tech debt? How many made the system harder to understand? Nobody tracks that.
The checklist is useful. Not as a pass/fail test, but as a direction. If you are doing even half of those things, you are ahead of most teams I have worked with in 20+ years.
Part 2 will cover MAGO’s journey toward success and practical objectives for systems leaders.
Previous: Chapter 11: Systems Leadership