With the help of Jerry Weinberg's description of the size/complexity dynamic in Quality Software Management Vol. 1 – Systems Thinking, I think I found an abstraction of Cockburn's reasoning on the failure of Agile adoption.
First of all I was delighted to see that Weinberg formulated the Generalization of Brooks's Law in a way quite similar to how Alistair Cockburn reasoned about it. The basis of the Generalization of Brooks's Law is that putting more people into a project makes the software team more complex. Weinberg shows that the manager of a software project can influence the team size and the requirements. In addition, some randomness enters the system, such as past experience. On the output side of the software project there is of course the product itself, along with other outputs like experience with the technology or the methodology.
The above picture is a redraw I made based on Figure 4-2 on page 59 in Quality Software Management Vol. 1 – Systems Thinking. It shows the cybernetic model of a software development system to be controlled.
The size/complexity dynamic – or the Generalization of Brooks's Law – then boils down to a curve: project size on the x-axis, project complexity on the y-axis. Weinberg argues that this curve is usually assumed to be linear, while it is in fact non-linear: a doubled team results in roughly four times the complexity. This does not just hold for team size. Likewise, I can reduce the scope of the delivery and thereby make the project less complex. Personally I think the need for Cockburn's approach with the Crystal family of methodologies can be explained in this way. In Weinberg's model, criticality is part of the randomness factor of the project, as is past experience.
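Weinberg's curve is empirical rather than a formula, but a common back-of-the-envelope illustration of why "doubled team" means "four times the complexity" is the number of pairwise communication channels in a team – a quadratic, not linear, quantity. The following sketch is only that illustrative analogy, not Weinberg's actual model:

```python
# Illustrative only: pairwise communication channels grow quadratically
# with team size, one common way to motivate the non-linear
# size/complexity curve. Weinberg's curve itself is empirical.

def communication_channels(team_size: int) -> int:
    """Number of pairwise communication channels in a team of n people."""
    return team_size * (team_size - 1) // 2

for n in (2, 4, 8, 16):
    print(f"team of {n:2d}: {communication_channels(n):3d} channels")
```

Doubling the team from 4 to 8 takes the channel count from 6 to 28 – more than four times as many, which matches the spirit of the non-linear curve.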
The above picture is a redraw I made based on Figure 10-5 on page 148 in Quality Software Management Vol. 1 – Systems Thinking. It shows two methods for developing software with differing limits. The idea of differing methods can also be carried over to team dynamics, applied based on past experience with the project, the technology and other outputs feeding into the randomness of the cybernetic model above. Cockburn coined the term sweet spots of methodologies and called the limits the drop-off points of a methodology.
Let me elaborate on this a bit more. Putting together Cockburn's approach and Weinberg's size/complexity dynamic yields a multi-dimensional mathematical space. Several variables form the axes of this space, and each project, each piece of software and each set of requirements is one point in it. What you do with a project is to probe one point in that multi-dimensional space. Traditional projects try a point out and either succeed or fail to deliver the software. Agile teams do the same. The difference is that Agile teams keep refining their particular point in this space to find out whether they are where they want to be. Over the course of a retrospective they find out, and if not, they adapt a few things. The changes are restricted to a small number in order to avoid drifting away too far too early in this space and losing control over the course. Agile teams make these refinements regularly and often: with an iteration length of one or two weeks, you get frequent course corrections during the journey of your project. Overall the team performs exploratory learning, making small corrections in order to find where they should end up.
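The probe-and-refine idea above can be sketched as a tiny search loop. Everything here is a made-up stand-in: the `fit` function pretends we could compute how well a point in methodology space suits a project (in reality this is judged in retrospectives, not calculated), and the two coordinates are arbitrary placeholder dimensions:

```python
import random

# Sketch of exploratory learning: start at one point in a (here 2-D)
# methodology space and, each "retrospective", try a small change,
# keeping it only if it improves the fit. Both fit() and the optimum
# at (0.7, 0.3) are hypothetical illustrations, not real measurements.

def fit(point):
    """Hypothetical fitness: higher is better, best at (0.7, 0.3)."""
    return -((point[0] - 0.7) ** 2 + (point[1] - 0.3) ** 2)

def retrospective(point, step=0.05):
    """Try one small adjustment; adopt it only if the fit improves."""
    candidate = tuple(x + random.uniform(-step, step) for x in point)
    return candidate if fit(candidate) > fit(point) else point

point = (0.2, 0.9)       # initial guess at a way of working
for _ in range(200):     # many short iterations, small corrections
    point = retrospective(point)
```

The restriction to a small `step` mirrors the argument in the text: large jumps risk drifting too far too early, while many small accepted-only-if-better corrections converge toward the sweet spot.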
For each methodology in the size/complexity dynamic model there are curves which represent the capability of the methodology. Just like in the sweet spots and drop-offs discussion in Agile Software Development – The Cooperative Game, there is a region where a methodology becomes impractical for a given complexity. Complexity here is the combination of requirements and team size plus the randomness: past experience, easy access to expert users, but also past experience with the methodology itself.
Here the reason for failures in Agile adoption comes into play. Shu-level actions are just one part of the story: they change the randomness in the size/complexity dynamic model. Thereby the problem at hand becomes less complex for the same team, and therefore the underlying problem size changes. If the methodology is not then adapted – by changing the practices and actions that are applied, or by transitioning to new practices and actions resulting from Ha-level insights – the whole adoption process can be derailed by the complexity resulting from the missing transition. The project fails because the team did not adapt their situation to the decreased problem demands. Thus the size/complexity dynamic explains Cockburn's Cargo Cult and Shu-action reasoning about failures of Agile adoption.
On the other hand, laziness will, from my point of view, end up in failure independently of the methodology chosen. This is why Software Craftsmanship was called in.