Antsstyle
Dec 26, 2021


You have brought up a very important and valid point - one that I failed to address. I do not contest that, in the past, Waterfall was often done badly, as you say. However, there are several counterpoints to make here:

Firstly, it is dogma I am fighting against here. In most places, Waterfall is now treated as a blasphemous word, never to be uttered in polite company, whilst Agile is held up on a pedestal as the gold standard of development (which is easy to see: look through any job vacancies and 'Agile' appears almost as often as the word 'developer' itself). This happens with complete disregard for whether Agile is actually good, or for why Waterfall has gone badly before, and so forth.

Secondly, I would argue that the fact Waterfall has previously been done in the manner you describe is not proof that Waterfall itself is bad. That does, however, lead to an important question: why can I say that bad project management means "Waterfall not done properly", while bad project management in Agile means "Agile is bad"?

The answer is that the concept of working in small iterations or pieces long predates software. That anyone - software engineer, project manager, or otherwise - chose to run a project in a "do the whole thing as one waterfall" fashion can therefore be shown to have been a bad decision even at the time, rather than evidence that Waterfall itself is inherently bad.

There are myriad examples of this in other industries. Architects, long before computers existed, did not start building the second floor of a house before they knew the first floor was stable. Cooks have for centuries built dishes in stages or 'layers', much like buildings, because trying to test everything at the end never worked (you didn't get those onions caramelized early? Too late to do it now; they're mixed into the rest of the dish and you'll never get the water out of them). Obviously these are not perfect comparisons, but they demonstrate the concept well enough.

That being said, it would be unfair to say that *all* older projects were done badly in Waterfall purely out of manager incompetence or similar reasons. Today's CI/CD tools, testing tools and the like were not as mature back then (Hudson/Jenkins only appeared in 2005, and Git also only appeared in 2005, though of course SVN and CVS were around before that). While it was still possible to unit test everything, or to set up a system where projects could be split into small cycles, it was not as viable in time and organisation terms as it is now. That made that kind of fine-grained splitting of projects a bit of a nonstarter.

As such, I don't disagree with you. What I am trying to say is that I do not think Waterfall in software is the same thing it once was in implementation terms; ultimately it is not so much a 'software management paradigm' as a general strategy for any kind of project, and how viable it is to carry that out in small iterations depends on the tools available. It has always - in every industry - been vastly preferable to do anything in small, complete cycles of building and testing where it is possible to do so, and it was usually done that way long before computers were around.

As it became more viable to write and test code in smaller blocks, Agile turned up - amounting to no more than "Waterfall, now that we have the tools to do projects in small pieces in a reasonable timeframe", but with a lot of toxicity woven into it.

The conclusion here, I think, is that "traditional Waterfall" and "modern Waterfall" are exactly the same thing. The only difference is how viable it is to split projects into very small pieces that can be done as separate build->test cycles.

In a sense, you could therefore say that the practice of doing projects in small iterations was traditional Waterfall's natural successor. I just think it is unfortunate that, in the form of Agile, it arrived bundled with a lot of nonsensical and harmful practices it did not need.
