Systems Thinking Chapter 3: Shifting Your Perspective
Chapter 3 opens with a quote from Donald Berwick: “Every system is perfectly designed to get the results it gets.” Read that again. If your system produces bad results, it was designed to produce bad results. Maybe not on purpose. But the design got you there.
This chapter is about perspective. Your perspective is not the full picture. It never is. You think you see everything, but you only see what your experience, your biases, and your expertise let you see.
The Elephant in the Room (Literally)
Diana brings up the classic parable about blindfolded people touching an elephant. One person touches the trunk and says “it’s a snake.” Another touches the leg and says “it’s a tree.” Each person is confident they know what it is. And each person is wrong.
This is exactly what happens with software systems. A backend engineer looks at a problem and sees database queries. A UX designer looks at the same problem and sees user friction. A product manager sees missed business metrics. Everyone is touching the same elephant. Nobody sees the whole thing.
The point is not that your perspective is wrong. It’s that it’s incomplete. You have blind spots. And the tricky part about blind spots is that you can’t see them; that’s what makes them blind spots. That’s why thinking together with other people matters. They see what you miss.
Five Core Practices
Diana lists five practices that are foundational for systems thinking. None of them are surprising, but all of them are easy to neglect.
1. You are constantly learning. Not just consuming courses or tutorials. Real learning means asking yourself “what don’t I know that would be good to know?” and then figuring out how to learn it. Hands-on, from books, by teaching others. She recommends picking something that excites you, not something you feel obligated to learn. Take notes. Follow ideas. Keep it light.
2. You structure inquiry. You figure out how to figure things out. When you face a new problem, you don’t just stare at it. You make a list of questions. You talk to people with different perspectives. You build prototypes. You create a strategy for understanding.
3. You do deep work. Distraction-free time for actual thinking. Not meetings, not Slack, not email. Focused time where you craft concepts. If your calendar is full of meetings, she says make space for contemplation. Cal Newport’s book “Deep Work” gets a recommendation here.
4. You respectfully engage. When someone shares an idea, your first job is to understand it, not to shoot it down. Diana points out how often our first reaction is “No.” Tech discussions, code reviews, RFCs. So much negating and dismissing. Even when you’re right about the technical point, being dismissive makes the communication process harder. Thinking together is not the same as posting your opinion.
5. You stop being Sisyphus. In Greek mythology, Sisyphus pushes a boulder up a hill forever. It keeps rolling back down. Some people in tech play “glue roles,” holding together teams and software parts that don’t hold themselves together. Cat herders. Diana says the goal is to change patterns, not enable them. If people refuse to think well together, that’s not a gluing problem, that’s a behavior problem.
Modeling: Making Thinking Visible
This is the big one in Chapter 3. Modeling is not about producing pretty diagrams. It’s about making the concepts in your head visible so others can see them, challenge them, and add to them.
Diana makes an important distinction here. Most organizations confuse modeling with “producing a diagram.” Think C4 models, infrastructure diagrams, deployment charts. Those are useful, but that’s not what she means.
Modeling is an action. It’s the process of thinking about a system visually. You explore ideas, gain insight, communicate. You can model with words, pictures, code, or all three. You can do it alone or with others. The key point: models make your thinking visible. They don’t represent reality. All models are incomplete and will become obsolete when you learn more.
Software teams often dismiss modeling as “big up-front design.” They say “the code is my model.” Diana calls this a misunderstanding. Modeling doesn’t replace code. It creates conceptual integrity that supports your confidence when coding.
The Iceberg Model
Here’s the practical tool of the chapter. The Iceberg Model. Most of the time we only see what’s on the surface, the events. A bug in production. A late project. A crashed service. We fix it, patch it, move on.
But the Iceberg Model pushes you to look at four levels:
Events (the tip). What happened? What was visible?
Patterns and Trends (below surface). Has this happened before? When? Under what circumstances?
Structures (deeper). What organizational rules, rituals, or processes support these patterns?
Mental Models (the bottom). What do we believe that gives rise to those structures?
Diana gives a great example. Two teams refuse to collaborate. The surface-level fix would be adding more management oversight. But if you go deeper with the Iceberg Model, you find that the organization’s hiring practices never tested for “how well do you think with others.” They only tested knowledge stock (what do you know right now) not knowledge flow (how well do you learn and adapt). The mental model underneath: experience with a specific technology tool predicts quality of future work. That belief shaped the hiring process, which shaped the teams, which created the conflict.
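The hiring example above maps cleanly onto the four levels, and I find it helps to literally write them down. Here is a minimal sketch of the Iceberg Model as a worksheet structure in Python (the class name, fields, and the specific entries for the hiring example are my own framing, not from the book). There is no algorithm here on purpose: filling in the deeper fields is the analysis.

```python
from dataclasses import dataclass, field

@dataclass
class Iceberg:
    """The four levels of the Iceberg Model, surface to depth.
    A facilitation worksheet, not an algorithm: the value is in
    the conversation that fills in the deeper fields."""
    event: str
    patterns: list[str] = field(default_factory=list)       # has this happened before?
    structures: list[str] = field(default_factory=list)     # what rules/processes support it?
    mental_models: list[str] = field(default_factory=list)  # what beliefs produce those structures?

    def report(self) -> str:
        # Unfilled levels print as "?" — a visible prompt to dig deeper.
        return "\n".join([
            f"EVENT:         {self.event}",
            f"PATTERNS:      {'; '.join(self.patterns) or '?'}",
            f"STRUCTURES:    {'; '.join(self.structures) or '?'}",
            f"MENTAL MODELS: {'; '.join(self.mental_models) or '?'}",
        ])

# The book's hiring example, phrased as one pass through the worksheet.
hiring = Iceberg(
    event="Two teams refuse to collaborate",
    patterns=["Conflict recurs whenever the teams must share work"],
    structures=["Hiring tests knowledge stock, never knowledge flow"],
    mental_models=["Experience with a specific tool predicts quality of future work"],
)
print(hiring.report())
```

The surface-level fix (more management oversight) only touches the `event` line; the worksheet makes it awkward to stop there.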
She adds a personal observation that I really liked. Some of the best engineers she worked with had failed whiteboard tests. Three times she saw organizations almost pass on candidates, hire them anyway, and six months later those people were the most valuable engineers on the team.
The Beer Game and the Blame Problem
Peter Senge’s Beer Game is a classic systems thinking exercise from MIT. Simple setup: four teams in a beer supply chain (Retailer, Wholesaler, Distributor, Brewery). Taylor Swift drinks a cranberry craft beer at the Super Bowl. Demand spikes. Teams have to order beer each week.
Millions of people have played this game. Most of them lose.
Why? Because each team reacts to events without seeing the whole system. The retailer orders more beer. The brewery needs four weeks to make it. The retailer doesn’t know this, so they order even more. Every team along the chain over-orders. Then all that beer finally arrives, just as the spike fades, and nobody wants it anymore.
The structure of the game (hierarchical, linear, no shared information) makes this almost inevitable. Teams play as four separate parts instead of one system.
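You don’t even need four teams to feel this dynamic. Here is a tiny simulation I wrote (not from the book, and much simpler than the real Beer Game) of a single retailer with a four-week delivery delay and a naive ordering policy: order whatever fills the gap between target and current inventory, ignoring beer already in transit. Demand merely doubles, yet inventory swings wildly between backlog and glut.

```python
def simulate(weeks=30, lead_time=4, target=12):
    """One retailer, one brewery, a fixed delivery delay.
    Naive policy: each week order (target - inventory), ignoring
    orders already in the pipeline -- the classic Beer Game mistake."""
    inventory = target
    pipeline = [0] * lead_time        # orders in transit; index 0 arrives next week
    history = []
    for week in range(weeks):
        demand = 4 if week < 4 else 8      # demand spike: the Super Bowl moment
        inventory += pipeline.pop(0)       # this week's delivery arrives
        inventory -= demand                # negative inventory = backlog
        order = max(0, target - inventory) # reactive, no view of the whole system
        pipeline.append(order)
        history.append(inventory)
    return history

h = simulate()
print(f"worst backlog: {min(h)}, biggest glut: {max(h)}")
```

Run it and you see oscillation: a deep backlog during the spike, then a mountain of unwanted beer weeks later. The fix isn’t smarter people; it’s a policy that sees the pipeline, which is exactly the “structure, not blame” point.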
Here is the part that really sticks with me. When the game ends and people are asked what went wrong, they don’t blame the system structure. They blame each other.
Diana says she’s not blaming blamers. We’re all terrible at thinking in systems. She teaches this stuff and still catches herself blaming the wrong things. But if you want to build resilient systems, you have to stop reacting and start looking at the structures and mental models underneath.
The Shift Is Hard
The chapter ends honestly. Shifting perspective is difficult. You’re not just tweaking your thinking. You’re detangling thoughts, experiences, mental models, feelings, and communication patterns. It’s deep work.
But Diana says it matters. And we’re not alone in doing it.
I think this chapter is the one where the book stops being theoretical and starts giving you actual tools. The Iceberg Model alone is worth the price of admission. Next time you’re in a meeting where everyone is pointing fingers, try drawing an iceberg. Ask what’s underneath the surface. You might be surprised what you find.
Previous: Chapter 2: Crafting Conceptual Integrity