Systems Thinking Chapter 2: Crafting Conceptual Integrity

Chapter 2 opens with a quote from Fred Brooks: “Conceptual integrity is the most important consideration in system design.” Written decades ago. Still true. Maybe more true now than ever.

What Is Conceptual Integrity?

Diana says our ideas design our systems. Everything running in production represents concepts that people prioritized, communicated, structured, and turned into code. If you want to change what runs in production, you first need to change the way you think about it.

This hit me hard. I spent years trying to fix systems by changing code. But the code was just a reflection of how people thought about the problem. Wrong mental model, wrong code. Every time.

Conceptual integrity means the parts of a system are in good relationship with each other. When ideas are cohesive, when they share healthy patterns and principles, when code changes actually improve the system’s ability to serve its purpose. That is conceptual integrity.

When it is missing, you get the classics. Data silos everywhere. That one Python script someone wrote 10 years ago that holds everything together. Teams that openly distrust each other. Seventeen side products duct-taped together, three of them doing the exact same thing. Technical debt piling up with no plan to pay it down.

I have seen all of these. Sometimes all at the same company.

Relationships Produce Effect

Diana quotes Donella Meadows here: you think because you understand “one” you must understand “two” because one and one make two. But you forget that you must also understand “and.”

Software becomes a system when parts together achieve something that could not exist without the “together” part. Two services might each work great individually. But the relationship between them creates a bottleneck. The relationship itself produces an effect that neither part produces alone.

This is where linear thinking fails us. We break things into parts, analyze each part, fix each part. But the problems live in the relationships between parts. Interdependence, information sharing, patterns. These shape system dynamics as much as the parts themselves.
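The bottleneck-in-the-relationship point can be made concrete with a toy simulation (my own illustration, not an example from the book). Suppose each service responds fast 99% of the time and is slow 1% of the time. Judged in isolation, both look healthy. Chain them synchronously, and the relationship itself produces a failure mode neither service has alone:

```python
import random

random.seed(42)

def call_latency():
    # One service call: usually fast, occasionally slow (a long tail).
    return 1.0 if random.random() < 0.99 else 100.0

def chain_latency(depth):
    # A synchronous chain waits for every hop, so tail events compound.
    return sum(call_latency() for _ in range(depth))

single = [call_latency() for _ in range(100_000)]
chained = [chain_latency(10) for _ in range(100_000)]

# Each service alone almost never hits its slow path...
slow_single = sum(1 for s in single if s > 50) / len(single)
# ...but a 10-hop chain hits at least one slow call far more often:
# 1 - 0.99**10 is roughly 10%, not 1%.
slow_chain = sum(1 for s in chained if s > 50) / len(chained)
print(f"slow alone: {slow_single:.1%}, slow in a 10-hop chain: {slow_chain:.1%}")
```

No part changed. Only the "and" between the parts did.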

Diana uses a neighborhood as an example. You can map property boundaries, count houses and people, label the police station and school. But that map will not help you understand the neighborhood. It will not help you predict how it changes when a factory closes or when gentrification starts. Events transform a neighborhood because of the relationships among elements. Same with software.

Systems Are Sociotechnical

This section contains maybe the most important idea in the chapter. Software systems are sociotechnical. Our thinking, behaviors, and communication patterns are inseparable from the software we produce. You cannot improve the technology system without improving the people system.

Conway’s Law shows up again. The organization’s communication structure becomes the system’s architecture. But Diana goes deeper. Linear thinking does not just describe how we think about code. It shapes what we expect from people too. We expect them to be predictable, rational, repeatable. Like machines.

She brings up the “carboat” example that made me laugh. One group wants a car. Another wants a boat. Nobody resolves the conflict at the systems level. Engineers get told to build a carboat. Everyone hates it. Nobody wanted a carboat.

I have seen so many carboats in my career. Two departments pushing conflicting requirements. No process for reconciling them into a coherent design. Just a battle of wills that everybody loses.

The Theranos example from “Bad Blood” is sobering. Communication only flows down the hierarchy. Engineering reality disconnected from leadership demands. PowerPoint presentations disconnected from reality. Performance measured by hours worked, not value produced. Most of us have not experienced anything that extreme. But familiar daily norms in tech work, things we consider reasonable, can cause real harm at scale.

Counterintuitiveness

This might be my favorite section. When you find a leverage point in a system, a place where a small change can produce big results, hardly anybody will believe you.

Why? Because the best answer to a systems challenge rarely matches our existing intuition. Changes that “make sense” to us match what we already think. A real leverage point works against what we know. The natural reaction is doubt.

Brooks gave us the most famous example: adding manpower to a late software project makes it later. Counterintuitive. True. Still ignored by managers everywhere.

Jay Forrester described visiting companies that already knew where their leverage point was. Everyone was pushing hard. In the wrong direction.

Diana says counterintuitiveness is not a bad thing. It is inescapable. We always have blind spots. Systems thinking is proactively looking for them.

Systems Are Always in Flux

Software is designed for particular circumstances, at a particular time, for a particular purpose. But circumstances change. The monolith fit the world it was born into. Microservices are needed for a new world. What changed?

Diana uses Donella Meadows’ simple system model. A state box with inflows and outflows. You identify a discrepancy between what you want and what you have. You change the inflows until the discrepancy goes to zero. Simple enough for one service.
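Meadows' single-stock model can be sketched as a tiny balancing loop (my own sketch; the names `stock`, `goal`, and `adjustment_rate` are illustrative, not the book's):

```python
def step(stock, goal, adjustment_rate=0.25):
    # Balancing feedback: set the inflow to close part of the gap
    # between the desired state (goal) and the actual state (stock).
    discrepancy = goal - stock
    inflow = adjustment_rate * discrepancy
    return stock + inflow

stock, goal = 0.0, 100.0
for _ in range(20):
    stock = step(stock, goal)

# The discrepancy shrinks toward zero with each iteration.
print(f"after 20 steps: stock={stock:.1f}, remaining gap={goal - stock:.2f}")
```

One state, one knob, one goal. The loop converges. That simplicity is exactly what disappears in the next paragraph.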

But when you have multiple services in relationship, the state of the system becomes something distributed. Inside the software, in the relationships between software, and in the relationship between the system and the world. If there is a discrepancy, where do you make a change? The answer is “it depends.”

Time is always a factor. We like to imagine it as linear. Gantt charts, milestones, next steps. But real work happens nonlinearly. Fifteen microservices and four teams working on different parts will be inherently asynchronous. Changes happening asynchronously, both intended and unintended.

Diana makes a nice distinction here. We should shift from “managing” to “orchestrating.” People and activities viewed as interdependent, like a symphony. The whole has cohesion even though the parts play in their own time.

Riding on the Front of the Train

The chapter ends with something personal. Diana’s colleague Mark told her: “You are on the front of the train. You look out and see a forest. You say, ‘Look at the trees!’ People riding in other cars say, ‘What are you talking about? That’s a lake!’ You don’t get to be mad at them. They aren’t looking at the trees yet.”

That is systems thinking. You see things others do not see yet. There is no technology you can adopt, tool you can use, or role you can be promoted into that will work universally. No magic bullet. Even if you had one, nobody would believe you.

You cannot know if you are right. But being right is always temporary anyway. Systems thinking is figuring out what “it depends” on. Then communicating new ways of seeing until people can see it too.

My Take

After 20+ years in IT, the carboat metaphor is painfully accurate. I have watched teams build things nobody wanted because the real conflict was never resolved. The ideas were bad not because the engineers were bad. The ideas were bad because the relationships between people, between departments, between concepts were broken.

The counterintuitiveness part resonated deeply. How many times did I see the right person with the right idea get ignored? Because the idea did not match what everyone already believed. And how many times was I that person, pushing in the wrong direction, completely sure I was right?

Systems thinking does not give you answers. It gives you better questions. And the humility to know you might be wrong. That is worth more than any framework or architecture pattern.


