Design patterns are reusable approaches to common problems - anti-patterns are the same, but ineffective (and even counterproductive).
An example of a design pattern is the singleton - we often want to allow access to a shared resource without incurring the cost of building it over and over. In code, this means restricting a class to a single instance, and then coordinating access to that instance wherever it is needed.
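A minimal sketch of the idea in Python (the class name and the "expensive setup" are illustrative, not from any particular library):

```python
class Config:
    """Singleton: built once, then shared by every caller."""
    _instance = None

    def __new__(cls):
        if cls._instance is None:
            cls._instance = super().__new__(cls)
            # Stand-in for expensive setup we only want to pay for once.
            cls._instance.settings = {"mode": "production"}
        return cls._instance

a = Config()
b = Config()
assert a is b  # every "construction" yields the same instance
```

Overriding `__new__` is just one way to do this in Python; a module-level object or a cached factory function achieves the same effect.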
Anti-patterns can include code smells (like the infamous “spaghetti”), but are often less technical. Any process that inhibits effective development is cast as an anti-pattern. Long meetings, analysis paralysis, planning too much and planning too little - all this and more is categorized and enumerated as reasons for poor engineering outcomes.
Generalizing the study of failure from the code itself to the ecosystem that produces it is a good thing. Yet this approach has a blind spot - nontechnical failures can also occur in purely technical contexts. And so I introduce a new engineering anti-pattern: superstition.
“Don’t touch that, it might break!”
“I don’t know why that is there, but it doesn’t hurt. Keep doing it.”
“This syntax is weird, but it works. I guess I’ll just reuse it.”
Even in my initial draft of this post, I fell prey to superstition. I used abstruse “LaTeX quotes” instead of straightforward “quotation marks.”
Shown in code:
``LaTeX quotes'' "quotation marks"
Why did I fall back on the old, cumbersome way? Because with past systems, it was necessary. My current setup doesn’t require it - but the old syntax still works, so the habit persisted.
This is sometimes called “legacy” or “backwards compatibility” - and psychologically, it is an immensely good thing. It doesn’t take that many rotations around the sun to become “legacy”, and any system that is human-facing should be empathic and avoid unnecessary changes. Newer is not necessarily better, and much iteration is simply churn.
But in the technical layer - the code that talks to code, and is written by people who are well-paid to understand it - anachronisms are simply incorrect. If a better approach is developed, it should be utilized. This does not mean that all new things are improved - but it does mean that all improvements should be employed.
When improvements are deferred, the cost compounds. Future developers will have even more to untangle if they want to update the system, which further pushes them to rationalize not doing so. After all, they have sprint goals to meet, and are rewarded for shipping features, not engaging in code archaeology. They rely on frameworks and builtins, but do not have the time to understand them.
This situation isn’t simply technical debt - it’s more like technical drift. The issues accumulate, yes, but they are also varied. Some parts of the codebase may be relatively healthy, creating uncertainty about where, or what, to refactor.
The resulting behavior is, as described at the opening, superstition. Even though engineers are tasked with building deterministic systems, when met with these challenges they find themselves indulging in all manner of “magical” thinking.
If you write code, take the time to understand it too. If you don’t write code, appreciate that those who do aren’t just manipulating pixels - they’re building a system, and that system requires care and maintenance.
Computational systems are not magic - but, if treated poorly, they (like a car that is never maintained) become unreliable. If it’s a prototype for an ephemeral purpose or a few users, maybe that’s okay - but if it’s a critical system or a real product, both the engineers and the users deserve better.