Source: Thoughtworks
Reclaiming cognitive sovereignty
There's one particular history of software development that can be told as a story of increasing levels of abstraction. From assembly code to high-level languages through to low and no code and, today, natural language prompting with AI, there's been an ongoing attempt to simplify the process of writing software.
While this trajectory has been driven by commercial imperatives and a need to make the field accessible to a wider workforce of technology professionals, there's also a cost. I'm referring, of course, to what Joel Spolsky described as the law of leaky abstractions: any attempt to encapsulate or hide complexity inevitably gives up control over some level of detail, and that detail will, at some point or other, leak.
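Spolsky's own canonical example was TCP, but the leak shows up even in something as humble as floating-point arithmetic. The abstraction of "numbers" hides their binary representation, and that detail leaks the instant we compare decimals. A minimal sketch in Python:

```python
# Floating-point numbers abstract away binary representation, but the
# abstraction leaks: 0.1 and 0.2 have no exact binary form, so their sum
# is not exactly 0.3.
total = 0.1 + 0.2

print(total == 0.3)   # the naive comparison fails: the detail has leaked

# The workaround forces us to engage with the hidden detail after all:
# we must compare within a tolerance instead of for exact equality.
import math
print(math.isclose(total, 0.3, rel_tol=1e-9))
```

The point is not the arithmetic itself, but that the abstraction's consumer is eventually forced to understand the very detail the abstraction promised to hide.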
However, the law of leaky abstractions is missing something: it places the focus on technology and tools; it doesn't speak to the human consequences of abstraction. By this, I'm thinking of what I'd like to call "cognitive leakage". In short, when abstractions shield us from complexity, we don't have to understand and grapple with it. This means we miss out on understanding or learning something that might well come to be important in solving a problem.
True, we rarely need to know everything to accomplish a task. But if we rely unthinkingly on abstractions we give up control (what I'd like to call cognitive sovereignty) over the technologies we use. That's a risky place to be. It has the potential to hinder both our personal development and our collective ability to tackle tough and complicated problems in the future.
Undigested complexity
It's important we appreciate there can be practical consequences that go beyond personal development. Technical debt, for instance, is often tied up with the challenges of different levels of abstraction. Rarely is it just a question of an older technology needing to be updated. (If it were that easy we surely wouldn't talk about it so much!) Often it's the abstractions themselves that start to leak as a system or its context evolves.
What's more, anyone who has worked on these kinds of projects will know that system complexity is very much a cognitive issue. It's often not the process of change that's challenging but instead understanding what's actually happening in the first place.
When faced with these challenges there are, broadly speaking, two options:
- A cognitive shift left: During the development phase, we face complexity head-on (writing code, writing tests, DDD modeling). This takes considerable time and effort but we arguably amortize the cost of understanding.
- A cognitive shift right: We can move quickly over system details with the help of highly encapsulated tools. These allow us to deliver quickly and cut costs upfront, yet the complexity remains.
I like to see this as a hidden variable in the "iron triangle" of cost, speed and scope. We're not dealing with a square exactly, but we do run the risk of the triangle going out of shape.
Think of it this way: when time and resources are fixed and we force an increase in delivery speed by introducing new abstractions in the form of black-box tools, what we save isn't the volume of code but the cognitive volume: the burden of complexity we haven't properly digested. In turn, undigested complexity accumulates in the system.
There are a number of ways this could manifest itself: perhaps as bugs during testing or bizarre glitches in production. Over time, it may become a fully-fledged black-box system that no one understands and no one dares touch: a big architectural ball of mud with a significant cognitive repurchase cost no one really wants to pay.
Refactoring and cognitive repurchase
Here's a classic engineering puzzle: why do teams facing a messy, ball-of-mud architecture vacillate between rewrite and refactor, yet fail at both? Either the refactoring plan fizzles out, or the rewritten system quickly degenerates into another ball of mud.
This can be attributed to system entropy or increased business complexity, both of which are accelerated by cognitive leakage. While it's true that all systems tend towards rot and decay without thoughtful and proactive maintenance, cognitive leakage ensures that process is faster than we might ordinarily expect.
At an organizational level this is clearly problematic. As a system scales, with further layers of abstraction added to the palimpsest of software, leaks and the attendant complexity and cognitive debt require further labor. Often managers will try to tackle the issue by adding people or implementing new KPIs.
This is, however, risky:
- When you add more people, you're not immediately tackling existing complexity, you're adding to it. Newcomers (however experienced they may be) won't have established the necessary cognitive meta-models; building them requires communication and training, and may dilute the team's average cognitive density.
- New KPIs may encourage certain activities or behaviors, but they can also lead teams to embrace abstraction further as they seek to cover up problems, even if piercing those existing abstractions is what really needs to be done.
While it's easy to see why short-term fixes are attractive, they stand in the way of organizations and developers taking a more deliberate and intentional approach to evolving their systems. They also create a kind of collective amnesia in which expertise about how and why software was produced the way it was is completely lost.
The cost of convenience
At this point an organization will have to choose between rewriting and refactoring. The problem is that such work involves far more than just typing code: it requires recovering lost requirements, logic and context.
This is the cost of cognitive leakage we need to repay. We spend time repairing logic that broke as a result of our comfort with an abstracted (some might say superficial) level of understanding; we have to rebuild test safety nets that atrophied through over-encapsulation. This challenging process is the necessary work of repair.
This isn't to say abstractions have no place in our work. That would be ridiculous and wrong. Indeed, all software as we know it today operates at some level of abstraction. Take high-level programming languages, for example: C++ and Java abstracted away the hardware but still require us to engage critically with the technical challenges we face. In some scenarios (small-scale systems, prototypes) even low code and AI assistance can be valuable.
However, I would suggest there's a qualitative difference between high-level languages and no-code and AI approaches. The latter allow us to bypass critical engagement completely. They simulate work, tricking us into believing we're solving problems when the problems remain untouched beneath the attractive sheen of whatever we've delivered.
In turn, this leads to a phenomenon I describe as software inflation and zombie systems. When complexity hits, developers either have to reclaim cognitive sovereignty or risk the system slipping into a kind of living death, beset by problems they can't decompose.
In short, convenience always has a cost. What's more, this cost is particularly heavy when applied to the strategic core: high business complexity, long lifecycles and cross-team collaboration make the leakiness of abstractions not only more problematic but also more likely.
Fortunately there are already things we can do to guard against this, starting with good governance. When using AI, for example, we can enforce mandatory code reviews to ensure cognitive synchronization; if using low-code, we can ensure the generated logic remains within the range of our cognitive control.
Cognitive sovereignty is a weapon: Use it
Every time you copy and paste code, and every time you choose to use a black-box tool, ask yourself one question: what's the cost of convenience? This isn't to say you should be dogmatic; far from it. It might well be the case that the cost of convenience is perfectly reasonable.
But it won't always be. And however much product marketers try to convince you that the one simple trick they're selling will solve your problem with no downsides, debt or leakage, remember there is always some cost. That's why maintaining cognitive sovereignty is essential: in a highly abstracted world of increasing AI assistance, it's a strength, not a weakness.
Ultimately, it's a critical weapon that can help you and your colleagues fight system decay and sustain product vitality.