Artificial complexity
Know when to accept it and when to reduce it
Today I want to shine some light on what I consider a blind spot in the arguments for agility. While others have written about this topic before, I have rarely heard it discussed.
The case for agility is usually made like this (see Cynefin as one example): Agile approaches are best suited for complex environments. “Complex” means there is more unknown than known. This complexity is a given and cannot be reduced (e.g. through upfront analysis as in “complicated”).
There is definitely some truth to this: what worked well for mass manufacturing does not work for knowledge work. Some aspects of an environment bring a natural complexity with them: the product (high tech, usability) and the market (countries with their legislation, currencies with their exchange rates).
Nevertheless, there is also what I call “artificial complexity”: Aspects of an environment that are currently complex but whose complexity is not a given at all. How a company organizes its work is often not the result of smart choices but a compromise of necessities, history and politics. I see this in processes, policies, frameworks, meetings, artifacts, reports, tools, technology layers, outsourcing decisions. This is an especially nasty version of Conway’s Law: Because of the overly complex organizational structure, the resulting product is also more complex than necessary.
The number of required documents, clicks and approvals cannot be blamed on Waterfall, because not even Waterfall demanded them. The complexity lamented in big organizations is largely self-inflicted: the result of organizational debt that was never paid back. And this is where framing everything as "complexity" can completely misguide efforts. The argument "our environment is complex, and nothing can be done about it" becomes an invitation to avoid the important topics.
This is what I consider the flaw of most "scaling" approaches: They accept complexity and present solutions for living with it. They aim too low by not challenging the status quo enough. I see this as one of the dividing lines within the agile community: manage by adding or simplify by reducing. Adding more process to deal with complexity is a poor choice because complexity escapes simple (or complicated) processes. "Fighting fire with fire" rarely works well.
Plus, my perception is that while organizations are busy implementing heavy structures and processes, some basic truths are forgotten: DevOps or "the cloud" do not suspend the rules of good software development. Your system will break where you go cheap.
Adapting a Star Trek quote (I think I first read this from Steve Tendon): "Complexity is the beginning of wisdom, not the end." The argument is that sometimes it is possible to bring everything down to relatively simple and easy-to-understand formulas, as in Throughput Accounting.
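To illustrate how short those formulas really are, here are the core Throughput Accounting relations as commonly stated in the Theory of Constraints literature (symbols are the conventional T, OE and I):

```latex
T &= \text{Sales revenue} - \text{Totally Variable Costs} \\
\text{Net Profit} &= T - OE \quad (OE = \text{Operating Expense}) \\
\text{ROI} &= \frac{T - OE}{I} \quad (I = \text{Investment})
```

Three quantities and three relations are enough to reason about most operational decisions, which is exactly the kind of reduction the quote argues for.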
This reminds me of a line from another movie, The Karate Kid Part II: "When you feel your life gets out of focus, always return to the basics of life." In the same spirit: if it gets too complex, make it simple again.
That is an important argument of flow-based approaches: Forget the org chart. Focus on Flow. I have seen this in Prateek Singh’s Scaling Simplified: A Practitioner’s Guide to Scaling Flow, Jon Smart’s Sooner Safer Happier and Mik Kersten’s Project to Product. Even Objectives & Key Results (OKR) are explicitly not supposed to mirror the existing org chart.
From my studies, I remember an assumption from business economics: we limit our models to things that can be measured in money. The danger is then looking only at those things. I think the same argument applies to agility and complexity: if the model assumes complexity is a given, we stop looking for the complexity we could remove.
Let’s not close our eyes to unnecessary complexity. It can be undone.
Paramore: Burning Down the House

