No one says, “This code is a ‘big ball of mud,’ but adding new features is quick and easy.”
But still, too often, I see developers attempt to be faster by skipping refactoring, tests, and other practices that help create good-quality code. They just start adding new lines, and the change soon becomes more complex than expected. Since they are developers, they will make it work in the end.
Sometimes they may even be faster at making that one change. However, the design will be harder to understand, the implementation more complex (high coupling, low cohesion, etc.), and the code less testable, making every following change harder and slower.
After cutting corners like this a few times, the code becomes too complex to understand, and every subsequent change, no matter how small, becomes risky.
The snowball effect of (losing) quality
Even worse, when the next developer needs to work on a new feature in that codebase, it will be easier for them to continue paying little attention to quality, because the codebase makes it evident that quality is not a priority (and, I would argue, that development speed isn’t either).
With such a cycle, in a relatively short time, any codebase can become complex, risky, and expensive to change.
There is also the question of what software quality means and how to measure it. I like this quote from the “2022 Accelerate State of DevOps Report”:
The ease with which an organization can safely and confidently change its software is a marker of the software’s quality.
David Farley
Creating quality software takes time, usually more than taking shortcuts during development, but it enables future changes and speed. And it’s not always easy, as Kent Beck warns us: “For each desired change, make the change easy (warning: this may be hard), then make the easy change.”
It might be interesting to note that Beck gives a lot of practical advice on when and how to improve your code in his excellent book “Tidy First?” – and there are many other books on creating quality code and software.
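To make Beck’s advice a bit more concrete, here is a minimal, hypothetical Python sketch (the discount example and all names are mine, not from Beck’s book): instead of wedging a new requirement into a tangled conditional, you first tidy the existing code so the change becomes easy, and only then make the easy change.

```python
# Hypothetical example: we need to add a new "student" discount.
# The change is hard in the tangled version, so first make it easy.

# Before tidying: every new discount means editing this conditional.
def price_before(order_total, customer_type):
    if customer_type == "regular":
        return order_total
    elif customer_type == "loyal":
        return order_total * 0.95
    elif customer_type == "employee":
        return order_total * 0.80
    raise ValueError(f"unknown customer type: {customer_type}")

# Step 1 - make the change easy: tidy the rules into one data structure.
DISCOUNTS = {
    "regular": 1.00,
    "loyal": 0.95,
    "employee": 0.80,
}

def price(order_total, customer_type):
    try:
        return order_total * DISCOUNTS[customer_type]
    except KeyError:
        raise ValueError(f"unknown customer type: {customer_type}")

# Step 2 - make the easy change: the new requirement is now a single line.
DISCOUNTS["student"] = 0.90
```

The tidying step costs a little time up front, but every later discount rule is now a one-line, low-risk change – which is exactly the point of the quote.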
Quality is the prerequisite for speed
There is no shortage of people perpetuating the myth that there is a trade-off between quality and speed. It’s just the opposite: quality leads to speed.
In his video “How To Build Quality Software Fast”, Dave Farley explains why there is “No Trade-Off Between Speed & Quality”, noting that cutting corners leads to slower, not faster, development.
Adam Tornhill shows data that confirm how a good-quality codebase improves development speed in his presentation “The Code Quality Advantage: How Empirical Data Shatters the Speed vs. Quality Myth”. He also shows when to invest in better quality and when it may not be needed (for example, in codebases that are rarely changed).
Even the State of DevOps reports find that “teams who score well on speed consistently score well on quality” – and they confirm it every year.
Hurry… in small steps
In my experience, the best way to achieve both speed and quality in software development is to do pair programming using test-driven and trunk-based development with a great deployment pipeline.
That combination makes you work in small (tiny) steps and gives you fast feedback from your programming partner, your tests, and your deployment pipeline – so you quickly catch and correct any design or coding error. By breaking complex problems into smaller, simpler ones, you can implement them faster, because each step is easy to understand, implement, and verify.
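As an illustration of what such small steps can look like in practice, here is a minimal, hypothetical test-driven sketch in Python (the shopping-cart example is mine, purely for illustration): each test captures one tiny requirement, gets the simplest implementation that passes, and is verified before moving on to the next step.

```python
# Hypothetical sketch of working in small TDD steps: write a tiny failing
# test, make it pass with the simplest code, then tidy before the next step.
import unittest

class ShoppingCart:
    def __init__(self):
        self._items = []  # list of (name, unit_price, quantity)

    def add(self, name, unit_price, quantity=1):
        self._items.append((name, unit_price, quantity))

    def total(self):
        return sum(price * qty for _, price, qty in self._items)

class ShoppingCartTest(unittest.TestCase):
    # Step 1: an empty cart costs nothing.
    def test_empty_cart_total_is_zero(self):
        self.assertEqual(ShoppingCart().total(), 0)

    # Step 2: one item at its unit price.
    def test_single_item_total(self):
        cart = ShoppingCart()
        cart.add("coffee", 3.50)
        self.assertEqual(cart.total(), 3.50)

    # Step 3: quantities are taken into account.
    def test_quantity_is_taken_into_account(self):
        cart = ShoppingCart()
        cart.add("coffee", 3.50, quantity=2)
        self.assertEqual(cart.total(), 7.00)

if __name__ == "__main__":
    unittest.main()
```

Each of these steps is small enough to review with your pair in seconds and to push through the pipeline immediately, which is where the fast feedback comes from.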
What is your experience with speed and software quality? Is there a trade-off? Please share your experience with me on Twitter or LinkedIn!