The Database Doctor
Musing about Databases

Coupling, Complexity, and Coding

Why is the IT industry obsessed with decoupling?

Does breaking systems into smaller parts you can understand individually really make them easier to manage and scale?

Today, we explore the pitfalls of this obsession and draw lessons from nature and other fields of engineering. As always, this will be a controversial take - if it wasn't, why would you bother reading this instead of some LLM-generated nonsense on LinkedIn?

Our IT Industry’s Obsession with Decoupling

With the exception of a few master programmers out there, our industry operates on a fundamental assumption: "decoupling is always good". Whether it’s microservices, unit testing, ORMs or security models, the prevailing belief is that breaking things into smaller, independent parts will make systems easier to build, manage and scale. This assumption permeates everything from software architecture to project management methodologies like Agile.

The siren's song goes something like this: break the system into small, independent pieces, and each piece becomes easy to understand - so surely the whole becomes easy to understand too.

Decoupling doesn't Stop Emergent Behavior

Real-world systems, whether in IT, nature, or human governance, exhibit emergent behavior: patterns and outcomes arising from interactions between components that are hard to predict by looking at each component in isolation. This seems to be a fundamental property of systems with non-trivial complexity (i.e. all systems worth making).

The allure of decoupling is that this emergent complexity can be reduced by reducing dependencies between components. However, this isn't really how things play out in real life.

Consider:

By decomposing a system into independent parts, we don’t make complexity disappear. We just move it to the spaces between components, making it harder to trace and debug. The failure of one microservice might not seem catastrophic, but when combined with latency, network failures, and race conditions, the overall system becomes more fragile than a tightly coupled monolith.

Let us look more deeply at some examples.

Agile: Local Optimization at the Cost of System Level Quality

Agile methodologies encourage breaking work into small, independent user stories that can be developed and delivered quickly. While this approach improves local efficiency, it often leads to global incoherence.

If we look at some of the largest, most complex and most durable systems in the world, we find that they were not made by agile methods.

The Linux kernel can perhaps be incrementally maintained - but it needs an architect and a hard-working super-brain like Linus to drive it. It evolves via a deeply coupled community of contributors, with a culture carefully evolved for that purpose.

The IBM mainframe, still in use today, was made by carefully planning the system as a whole - and by continuously refining the design at the system level.

The Internet itself was designed as a military project - one of the least agile environments in the world. It was made to withstand attacks on individual components by giving each subsystem a high degree of autonomy and the ability to discover its peers. It is an interwoven set of protocols carefully designed to function at the system level from the very birth of the system.

Unit Testing: Isolating Logic at the Cost of System Understanding

Unit testing and its ugly sibling, Test Driven Development, are a set of ideas that just won't die. The premise is that each function, class, or service should be tested in complete isolation. While this makes individual units reliable, it doesn't account for how they interact within the whole system.
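To make this concrete, here is a minimal sketch (the payment functions and gateways are hypothetical, invented purely for illustration) of how an isolated unit test can stay green while the composed system fails. The mock encodes what we believe the collaborator does, not what it actually does:

```python
# A minimal sketch - charge() and the gateways are hypothetical.
from unittest.mock import Mock

def charge(gateway, amount: int) -> bool:
    # The unit under test assumes the gateway takes pence...
    return gateway.pay(amount)

def test_charge_in_isolation():
    gateway = Mock()
    gateway.pay.return_value = True
    # Green: the mock simply agrees with our (wrong) assumption.
    assert charge(gateway, 1999)

class RealGateway:
    def pay(self, amount: float) -> bool:
        # ...but the real gateway takes pounds and enforces a limit.
        # No unit test of charge() in isolation can catch the mismatch.
        return amount < 100.0

test_charge_in_isolation()           # passes
print(charge(RealGateway(), 1999))   # False - the *system* is broken
```

Only a test that exercises the real composition - the system, not the unit - can expose the broken contract.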

I have touched on this in a previous blog post, but the key point bears repeating:

In nature, testing happens at the system level. Organisms (which are orders of magnitude more complex than any programs we write) don't test individual cells in isolation; they test survival ... in real environments.

Microservices: Distributing Complexity Instead of Reducing It

The shift from monoliths to microservices was supposed to improve scalability, flexibility, and maintainability. However, in many cases, it simply shifts complexity elsewhere.

Your organisation isn't a Web 2.0 company and the problems you have are not going to be solved by microservices. Apologies to Google and Amazon if you are reading from inside one of those organisations.

Compare all this with the ease of debugging and building a monolith. All components live in the same memory space, or at least the same instance of the operating system. You can literally "see" everything by executing top and attaching a debugger directly to each process. Yes, in principle you could have the same debugging experience with sufficient tooling in a microservice world. But those tools don't exist yet - and maybe never will (sorry Loki, you are still a child in this world).

ORMs: Abstracting SQL at the Cost of Power and Performance

Object-Relational Mappers (ORMs) are designed to decouple application logic from direct database interactions by providing an abstraction layer over SQL. The common excuses are: "It allows me to switch databases if I need to" and "I can use OO-style logic to access the database".

First, switching databases is an incredibly rare event. Doing so isn't that hard (I have done several such projects in my career). SQL, despite its many variants, is still reasonably standardised (except MySQL, but if you picked that, you have bigger problems than an ORM can help you solve). Hacking the ORM to work with a different database is genuinely tricky and not the smooth migration that ORM abusers think it is. Databases have subtleties that the people who write ORMs simply don't understand (because if they did, they wouldn't have written an ORM in the first place). When you replace the database software, the ORM may suddenly change behaviour in subtle ways that would have been obvious if you had just ported the SQL.

Second, OO is one way to write code - it isn't the only one. Perhaps (very likely) OO isn't even the best way to write code (if there is such a thing). Each problem domain has its own languages, and SQL happens to be a language custom-made for dealing with complex data. If you have decided to learn an ORM, maybe you are better off learning SQL instead. Broaden your horizons, as they say.

Third, while ORM proponents tell you it will simplify data access, what an ORM really does is cripple the full power of SQL.
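As a concrete (if contrived) illustration, here is the classic N+1 pattern that ORM-style, object-at-a-time access tends to produce, next to the single SQL statement that replaces it. The sketch uses plain sqlite3 and an invented two-table schema so it stands alone:

```python
# Hypothetical schema; sqlite3 stands in for the "real" database.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY,
                         customer_id INTEGER REFERENCES customers(id),
                         amount REAL);
    INSERT INTO customers VALUES (1, 'Alice'), (2, 'Bob');
    INSERT INTO orders VALUES (1, 1, 10.0), (2, 1, 25.0), (3, 2, 5.0);
""")

# ORM style: fetch the parents, then one query per parent (N+1 round trips).
totals = {}
for cust_id, name in conn.execute("SELECT id, name FROM customers"):
    total = conn.execute("SELECT SUM(amount) FROM orders WHERE customer_id = ?",
                         (cust_id,)).fetchone()[0]
    totals[name] = total

# SQL style: one statement, one pass, and the optimiser picks the plan.
totals_sql = dict(conn.execute("""
    SELECT c.name, SUM(o.amount)
    FROM customers c JOIN orders o ON o.customer_id = c.id
    GROUP BY c.name
"""))

assert totals == totals_sql == {"Alice": 35.0, "Bob": 5.0}
```

Most ORMs have escape hatches for exactly this case - but the join, the aggregate and the access path are precisely the things SQL is built for, and precisely what the object-by-object style hides from you.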

Instead of decoupling you from the database, ORMs introduce a subtle level of coupling that ultimately leads to higher complexity than if you had just used SQL.

"ORM will decouple me from the database" ... is a lie.

Nature doesn't do This!

While our industry (IT) pushes for decoupling at all costs, nature offers a different lesson: Evolution creates systems that are inherently coupled and which rely on deep interconnections to function efficiently. Rather than enforcing strict separations, mother nature balances modularity with integration.

Biological Systems: Coupled for Survival

Ecosystems: Interdependence Creates Useful Macro behaviours

Nature Avoids Over-Decoupling

Nature relies on emergent behaviours to create complex systems. The most sophisticated biological functions emerge from the interaction of deeply coupled subsystems.

Consider the anthill: the result of a bunch of simple state machines operating together in a network to produce an emergent behaviour.

On the other hand, nature also uses decoupling when it is effective to do so. For example, cells operate mostly independently (which is how evolution got off the ground in the first place - sorry, but not sorry, creationists). Individual "modules" have evolved several times because they are useful in themselves, without needing coupling to function - the eye, for example, has evolved independently in many species.

The lesson we can take from nature is:

"Decoupling should be used strategically, not as a universal principle"

Other Humans aren't obsessing over de-coupling either

Most traditional fields of engineering embrace coupling as an inherent and necessary aspect of system design. In these fields, components are not designed to be loosely connected and interchangeable, but to work together in interdependent, tightly integrated ways. This reduces both complexity and cost.

Structural Engineering: Deeply Integrated Systems create Stability

Bridges, buildings, and road networks rely on strong, interconnected structures rather than loosely coupled components.

While each component has known structural properties (and we could even say it can be tested in isolation), the system as a whole is designed with coupling in mind.

Mechanical Engineering: Precision from Integration

Mechanical systems are optimized through tight coupling between components to balance performance, cost and reliability.

Consider cars as an example.

Pistons, crankshafts, and fuel injectors must operate with precision timing even though each component is locally decoupled (in the sense that it can be tested in isolation). Components are made to the exact specifications of the entire engine system. Testing at the system level creates feedback loops that drive the refinement of each component. If each part were designed in isolation, the system as a whole would be more expensive to build and more likely to fail in unpredictable ways.

Brakes must be carefully designed with the properties of the entire car in mind (weight, tires, potential top speed). The more complex the car and the faster it can go, the higher the coupling becomes between components, as weight and functionality are carefully balanced against the desired speed.

On the other hand: car seats - apart from their weight in a race car - can be designed largely in isolation. They can be replaced entirely without affecting the car itself. They are decoupled from the car because it makes sense to do so. But what we appreciate about a luxury car is exactly the way each component seems to fit into the whole - including the aesthetics of the interior (Tesla owners: you wouldn't understand this point - but trust me on this if you buy German one day).

Law: Interwoven by Design!

Legal systems embrace coupling as a fundamental necessity for coherence, enforceability and governance.

Laws, regulations, and judicial precedents are not isolated rules but parts of a deeply interconnected structure that maintains order and consistency (sorry Trump voters who still think the Great Orange Leader is not crazy: you wouldn't understand this point - you can skip reading now).

A Web of Interdependent Rules

Legal statutes do not exist in isolation. A single law often references multiple others, ensuring that new provisions align with existing frameworks. Laws are complex networks of dependencies that must be considered together to understand their full implications. There can even be contradictions between these frameworks - but that does not prevent the system from functioning at the macro level.

Government branches (executive, legislative, judicial) are coupled through constitutional mechanisms, preventing any single entity from gaining unchecked power. The system is explicitly designed to be interdependent and not have a strict hierarchy of "power modules".

What can we learn from other fields of human behaviour?

Coupling is the norm in nearly every other field of engineering and human effort. Instead of avoiding coupling because it is too hard to grasp, it should be embraced and managed. Coupling leads to cheaper, more reliable and more effective system-level designs.

Coupling is also how we govern our species and run our economies at scale. Despite countless wars and attempts at destroying each other, we are still here. Something about coupling works!

Why are we not doing this in IT?

It wouldn't be me, if I didn't have a theory...

Why is IT obsessed with decoupling? The Low Barrier to Entry

Unlike other fields of engineering, software development has an unusually low barrier to entry. Becoming a mechanical engineer, a civil engineer or a solicitor requires years of formal education, certifications, and real-world experience. In contrast, many software developers enter the field through self-learning, coding boot camps, or without formal training at all. The money you can make in our industry attracts a lot of bullshit artists and charlatans.

Because of this, we live in an industry largely without the formal training and apprenticeship that other engineering disciplines take for granted.

The outcome of all these factors is a generation of developers who are not trained in the principles of system design - "engineers" who only know how to optimise for solving local problems. Decoupling is a local optimisation that is easy to understand, whereas the principles of system design with coupled components take experience to grasp.

Did I mention that LLMs will make this worse?

Database Architecture: An Example of Managed Coupling

Databases can often gain significant benefits by carefully choosing when to couple and when not to.

The optimiser benefits from being de-coupled from the rest of the database (example: GPORCA). This allows us to test the optimiser in isolation and to swap it out for a different, experimental one. Here, we apply de-coupling because it isolates the interface of a very complex component (the optimiser) and prevents the complexity of that component from bleeding into other components.

But we often find that some queries are "trivial" and do not need the optimiser, or need a shortcut through it (the optimiser is an expensive code path) - for example, SELECT statements without joins and singleton INSERT statements. Instead of letting the optimiser build an entire query graph for a trivial query, we can build such a graph directly in the parser the moment we realise that the query we are currently parsing is of that type. For OLTP systems, such optimisations can lead to significant performance improvements (10-100x) and allow us to apply optimisations to a very specific code path that is highly coupled with both the parser and the optimiser.
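In code, the shape of such a fast path might look like the sketch below. Every name here is hypothetical, and a real parser would of course work on a token stream rather than split strings, but the shortcut has the same structure:

```python
# A toy parser-level fast path - all names here are hypothetical.
from dataclasses import dataclass

@dataclass
class Plan:
    kind: str   # "fast_insert", "fast_select" or "full"
    note: str

def plan_query(sql: str) -> Plan:
    tokens = [t.upper() for t in sql.strip().rstrip(";").split()]
    # Singleton INSERT: a single VALUES row, no SELECT as the source.
    if tokens[0] == "INSERT" and "SELECT" not in tokens:
        return Plan("fast_insert", "plan built directly in the parser")
    # Join-free SELECT over one table: there is no join order to pick,
    # so the expensive optimiser code path can be skipped entirely.
    if tokens[0] == "SELECT" and "JOIN" not in tokens \
            and tokens.count("FROM") == 1:
        return Plan("fast_select", "shortcut through the optimiser")
    return Plan("full", "hand the graph to the cost-based optimiser")

print(plan_query("INSERT INTO t VALUES (1, 'x')").kind)         # fast_insert
print(plan_query("SELECT a FROM t WHERE id = 7").kind)          # fast_select
print(plan_query("SELECT * FROM a JOIN b ON a.id = b.id").kind) # full
```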

We can often gain significant performance improvements and simplify the architecture by coupling a code generator with the execution engine - injecting generated code directly into the pipeline of execution instead of letting the execution engine interpret each node in the query graph.
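A toy version of that idea, with a completely invented two-operator plan: instead of interpreting a plan node per row, we emit the source of the fused pipeline once, compile it, and hand the execution engine a single function to call:

```python
# Hypothetical plan: a filter followed by a projection.
plan = {"filter": "row[1] > 6.0", "project": "row[0]"}

# Generate the fused pipeline as source code - one tight loop,
# no per-row dispatch through plan nodes.
src = f"""
def run(rows):
    out = []
    for row in rows:
        if {plan['filter']}:
            out.append({plan['project']})
    return out
"""
namespace = {}
exec(compile(src, "<generated>", "exec"), namespace)
run = namespace["run"]

print(run([(1, 10.0), (2, 25.0), (3, 5.0)]))   # -> [1, 2]
```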

It is also desirable to have a tightly coupled query lifetime component. This aids in debugging and managing complexity from the perspective of the user. The database should always be able to answer the question: "Where is the query right now and what is it doing?". Such a lifetime component must be aware of nearly every other component in the system - and every other component needs a way to send information to it. In a way, the query lifetime component is the central nervous system of the database - tightly coupled with every other component. It also means that the query itself becomes highly coupled with the entire stack.
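A sketch of what such a lifetime component could look like - the API is invented, but it shows the shape of the coupling: one registry, and a single reporting call that every other component must make:

```python
# Hypothetical query-lifetime registry: the "central nervous system".
import threading
import time

class QueryLifetime:
    def __init__(self):
        self._lock = threading.Lock()
        self._state = {}   # query_id -> (component, phase, timestamp)

    def report(self, query_id: int, component: str, phase: str) -> None:
        # Every component in the stack is coupled to this one call.
        with self._lock:
            self._state[query_id] = (component, phase, time.time())

    def where_is(self, query_id: int):
        # Answers: "where is the query right now and what is it doing?"
        with self._lock:
            return self._state.get(query_id)

lifetime = QueryLifetime()
lifetime.report(42, "parser", "parsing")
lifetime.report(42, "optimiser", "join ordering")
lifetime.report(42, "executor", "scanning orders")
print(lifetime.where_is(42))   # ('executor', 'scanning orders', <timestamp>)
```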

Summary: When to Decouple and when to Couple

Our industry needs to rethink its obsession with decoupling. Like Patterns and "Best Practices", we are blindly applying the "decoupling is always good" principle locally, thinking it will lead to better system-level design globally. The result is overly complex, expensive and fragile designs with a great deal of technical debt and code duplication.

Instead of automatically asking "How do we break this apart into smaller components I can understand?", we should ask "How do we manage complexity so I can understand the system itself?". De-coupling should only be introduced if the resulting system-level design is less complex than a coupled design. Often, there exists a coupled design which is more maintainable, more scalable and less complex than a decoupled one. As we can see in other fields of engineering and in nature itself, coupling is the norm, not the exception. This insight should humble us and make us question our a priori decoupling strategies.

At this point, you might feel that I insulted your political views or that I made you feel angry. You might think I am an idiot (you're welcome - you would not be the first to say that). Now might be a good time to stop reading, before I further insult your sensitive feelings...

.... Oh... You are still here? Alright then...

Stretching the Idea: Coupling as a stabilising effect on civilisation

Coupling can, in some ways, be seen as the antithesis of hierarchy. It's anti-lobster mode!

Hierarchy manages complexity by breaking it into smaller, easily understandable parts which talk to each other using simple rules. Hierarchy builds behaviour one step at a time, by clearly placing responsibilities on each component. Unfortunately, the outcome is often a system that is brittle and prone to cascading, systemic failures. It also results in systems that are incomprehensible to each actor within them (because knowledge is isolated). When designing hierarchies, we place an enormous amount of trust in "architects" (or politicians, or CEOs) being able to manage the resulting complexity. As should be clear to anyone who has lived on this planet and paid attention, that trust is nearly always misplaced.

Coupling manages complexity by allowing parts to interact with each other in complex ways governed by rules (or laws of nature) that result in desirable, emergent behaviours. It relies on each component understanding a larger part of the system than in a hierarchy. The complexity reduction of coupling comes about by allowing each component to be more resilient against changes in the system as a whole - because the component has enough "knowledge" to build this resilience.

The analogy to political systems should be clear. One of the greatest achievements of civilisation is the creation of a form of government (democracy) where each "component" (the individual) has a great deal of autonomy and resilience. Pretty much everything we today call "modern civilisation" has been built by civilisations that operate as highly coupled systems. The iPhone in your pocket, the crop yield of your local farms, the fridge that keeps that food from rotting, the car you drive, the roads you drive on, the tools you use for building a house, the plane you fly on holiday, the straw (plastic or paper) in your drink on the beach, the sunscreen preventing you from getting cancer when you tan, the cancer treatment in case you get it anyway: these all came out of systems that run the "democratic operating system" of autonomous actors. Coupling works! It creates useful things that make humans happy.

The idea of trusting each actor with independent decision-making and power stands in stark contrast to the idea of hierarchical control. Coupled systems are more resilient to change and, ultimately, more stable than hierarchical systems. The collapse of the Soviet Union is a good example of what happens when a hierarchical system (the Soviets) stands face to face with a coupled system that can adapt. The establishment of highly coupled organisations (e.g. the UN, the EU, the WHO) shows how we can introduce coupling into human governance to construct systems which, while slow moving, are able to stabilise the world we live in and adapt to technological change in a responsible manner. This slows things down, but it avoids the chaos which ensued when the world was controlled by power hierarchies competing with each other. Coupling comes with its own complexity: the protocols and ground rules needed for actors to communicate effectively. I personally feel we have failed at adapting those protocols to the modern, globalised world. That does not make the system itself broken, and we shouldn't remove the coupling - that would just result in the same amateur systems we get from blindly applying decoupling. But perhaps we need a few software updates to make each component more resilient.

Coupled systems take more effort to understand. They require a deeper appreciation of the system as a whole. They require you to engage with the reasons things are the way they are. Taking a sledgehammer to a highly coupled system and replacing it with the will of a few individuals who value hierarchy is playing with toys that neither you, nor anyone else, understands. If you are playing with those toys, you are betting not just your own future, but that of your children.