Getting Rid of Technical Debt in Agile Projects

DavidTz40 - Dec 29 '22 - Dev Community

Technical debt originally referred to the rework created by shipping quick-and-dirty code, but in today’s fast-paced software delivery environment, the term has broadened. Technical debt may be anything that the software development team puts off for later: inefficient code, unfixed defects, missing unit tests, excessive manual tests, or absent automated tests. And, like financial debt, it is challenging to pay back.

Just as you should not take out a loan without a strategy to repay it, you should have a plan in place when accumulating technical debt, whether intentional or unintentional. The most critical aspect of this approach is transparency: adequate tracking and visibility of the accumulated debt. Armed with this information, the organization can design a strategy for when and how to “pay off” technical debt. This is known as debt repayment.

Making a Strategy to Pay Off Technical Debt

Assume a development team working on a new project began by adhering to a particular coding standard. They even set up an automated tool to analyze the code regularly and report on compliance with the standard. However, the team became overburdened and stopped running the tool after a sprint or two, and when the project director requested a report a few months later, it showed numerous errors and warnings, all of which now had to be addressed.
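
When a team does adopt such a tool, the usual safeguard is to wire the compliance check into the build so violations surface on every run instead of piling up unnoticed. Below is a minimal sketch of such a gate, assuming a Python codebase linted with flake8; the source path and the zero-violation budget are illustrative assumptions, not prescriptions from this article.

```python
# Minimal sketch of a coding-standard gate, assuming a Python codebase
# linted with flake8. The path and the zero-violation budget are
# illustrative assumptions.
import subprocess
import sys

MAX_VIOLATIONS = 0  # hypothetical budget: fail the build on any violation

def count_violations(path: str = "src/") -> int:
    """Run flake8 and return the total number of reported violations."""
    result = subprocess.run(
        ["flake8", "--count", "-qq", path],  # -qq: print only the total
        capture_output=True,
        text=True,
    )
    lines = result.stdout.strip().splitlines()
    return int(lines[-1]) if lines else 0

if __name__ == "__main__":
    violations = count_violations()
    print(f"Coding-standard violations: {violations}")
    if violations > MAX_VIOLATIONS:
        sys.exit(1)  # surface the debt now rather than months later
```

Run on every merge or nightly, a gate like this makes the debt visible sprint by sprint instead of all at once when a report is finally requested.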

Agile teams committed to delivering as much customer value as possible in each sprint frequently encounter this situation. Even though all the functionality is in place, the issue must be resolved now, because the team does not want to deploy code that does not meet production standards. The team is then confronted with a few debt-service choices:

  • Divide all defects and warnings among the development team and have them make the corrections during the following sprint, scheduling additional hours on top of their usual development work.

  • Estimate the refactoring work and either schedule it as new user stories for forthcoming sprints or incorporate it into existing user stories.

  • Negotiate with the product owner the number of user stories planned for the following sprint, in order to free up time for reworking the code.

  • Devote an entire sprint to refactoring code.

  • Plan to distribute the effort across several sprints, with a deadline to finish the cleanup before the release is completed.

Though all of these are legitimate options, the ideal strategy depends on the team, the context, forthcoming deadlines, the level of risk the team is prepared to accept, the priority of the functionality that must be delivered, and engagement with the product owner. Again, just as you would with financial debt, you should plan to pay off technical debt as soon as possible with the resources you have. It’s a good idea to conduct a risk assessment of the scenario and agree with the team on the best course of action.

Technical Debt in Software Testing

Technical debt is not limited to programming. Inadequate testing of user stories, letting regression tests pile up for later sprints, failing to automate essential tests, leaving test scenarios incomplete, not cleaning up test environments before the next version, and not testing current features with all relevant test data combinations are all likely to accrue technical debt over time. Occasionally debt is accumulated on purpose for the short term, such as skipping a test-data update on the final day of the sprint owing to time pressure, while intending to do it within the first couple of days of the next sprint. It’s fine to defer certain technical debt for a short time as long as the team approves.
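
One way to keep missed data combinations from quietly becoming debt is to make the test data declarative, so adding the combination you skipped is a one-line change. Here is a minimal sketch assuming pytest; the discount function and the data values are hypothetical stand-ins.

```python
# Data-driven test sketch, assuming pytest. The function under test and
# the data rows are hypothetical; the point is that a deferred test-data
# combination becomes a one-line addition to the table.
import pytest

def apply_discount(price: float, percent: float) -> float:
    """Hypothetical production code under test."""
    return round(price * (1 - percent / 100), 2)

@pytest.mark.parametrize(
    ("price", "percent", "expected"),
    [
        (100.0, 0, 100.0),    # no discount
        (100.0, 10, 90.0),    # typical case
        (100.0, 25, 75.0),    # quarter off
        (0.0, 50, 0.0),       # zero-price boundary
        # (100.0, 100, 0.0)   # combination deferred last sprint: uncomment to pay the debt
    ],
)
def test_apply_discount(price, percent, expected):
    assert apply_discount(price, percent) == expected
```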

Sometimes debt may be purposefully incurred for a longer period by planning ahead. For example, you can decide to delay system-level nonfunctional testing, such as scale, load, or security tests, until a few sprints have passed and features are stable enough to make the tests worthwhile. Again, deferring certain activities is acceptable as long as the team acknowledges the risk and has a strategy to address it. Testing debt can get you out of difficult situations when necessary; however, you must still plan meticulously, keep track of the debt, communicate it openly and regularly, and pay it off as quickly as possible. Having a strategy in place to pay off these debts eases the strain over time and ensures that your software maintains its quality.
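
One lightweight way to honor that “keep track of the debt” rule is to record the deferral in the test suite itself, so every run reports what is still owed. The sketch below assumes pytest; the test names, sprint number, and ticket ID are hypothetical.

```python
# Deferred nonfunctional test recorded as visible debt, assuming pytest.
# Names, sprint number, and ticket ID are hypothetical.
import pytest

@pytest.mark.skip(reason="Deliberate debt: load test deferred to sprint 14, tracked as DEBT-102")
def test_checkout_under_500_concurrent_users():
    """Nonfunctional (load) test, postponed until the feature stabilizes."""
    ...

def test_checkout_single_user():
    """Functional coverage still runs every sprint."""
    assert True  # placeholder for the real functional assertions
```

Every test run then prints the skip reason, so the deferred work stays in plain sight until it is paid off.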

Better To Prevent Than to Cure

“It is better to stop something bad from happening than it is to deal with it after it has happened,” as the old saying goes. To avoid technical debt, each team must develop its own approach, but a general best practice is to have a definition of “done” in place for all activities, user stories, and tasks, including the essential testing activities. A definition of “done” establishes a shared understanding of completion, ensuring that everyone participating in the project means the same thing when they declare something done. It becomes a representation of the team’s quality standards, and as its concept of “done” becomes more stringent, the team becomes increasingly efficient.

Who Determines the Definition of Done Criteria?

The Definition of Done (DoD) should be one of the main starting points for Agile projects involving different stakeholders. The Product Owner and the team should always work together to agree on a clear Definition of Done, which defines when a story or feature is ready to be delivered as an increment.

Interpretation of the Definition of Done Among Teams

Over the years, I have seen different interpretations of how to determine the DoD. One common approach is to declare development finished once testing has been conducted. In this case, the team says, “a story is completed only when the tester on the team says it’s done.” If that is how your team decides to work, you must ensure the tester is the team’s focal point with the PO, so that the team understands the PO’s intentions.

For me, this method of “Tester” approval is less effective for several reasons:

  • It creates a single point of failure.

  • Developers may ignore their responsibilities.

  • It creates a logical separation between testers and developers.

The more common (and far more effective) way to implement DoD is to use checklists that specify the criteria and requirements needed for completing a feature, sprint, or story.
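
Such a checklist can even be made executable, so “done” becomes a computed fact rather than an opinion. The sketch below is one illustrative way to encode a story-level DoD in code; the class structure and criteria wording are assumptions that mirror the lists later in this article.

```python
# Illustrative sketch: a story-level Definition of Done as an executable
# checklist. The criteria mirror the user-story list later in the article;
# the class and method names are hypothetical.
from dataclasses import dataclass, field

@dataclass
class DoDChecklist:
    criteria: dict[str, bool] = field(default_factory=lambda: {
        "tasks identified, estimated, and approved": False,
        "code integrated into the main branch": False,
        "all bugs reported and verified": False,
        "coding and testing activities complete": False,
        "acceptance criteria and tests attached": False,
    })

    def check(self, criterion: str) -> None:
        """Mark a single criterion as satisfied."""
        if criterion not in self.criteria:
            raise KeyError(f"Unknown DoD criterion: {criterion}")
        self.criteria[criterion] = True

    def is_done(self) -> bool:
        """A story is done only when every criterion is met."""
        return all(self.criteria.values())

    def outstanding(self) -> list[str]:
        """List the criteria still blocking 'done'."""
        return [c for c, met in self.criteria.items() if not met]

story = DoDChecklist()
story.check("code integrated into the main branch")
print(story.is_done())      # False: four criteria remain
print(story.outstanding())  # lists exactly what still blocks "done"
```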

Why Do We Need a Definition of Done?

  • It provides a simple checklist, which simplifies the development activities of coding, estimation, design, etc.

  • It helps increase visibility and transparency.

  • It reduces rework costs once a feature or story has been accepted as “done.”

  • It helps create a culture of collaboration and increases communication.

  • It provides a clear contract between the team and the customer, limiting the risk of misunderstandings and assumptions that can lead to conflicts.

  • It allows the organization, especially engineering teams, to understand what is expected of them once they make commitments at the beginning of the sprint.

  • It increases the efficiency of the entire process because it provides clear criteria about what needs to be completed to finish an artifact, such as a feature, sprint, or user story.

On What Levels Can We Use It?

The Definition of Done is mostly associated with user stories but can also be used in other areas such as sprints and features.

Definition of Done for a Feature

The following criteria may determine the Definition of Done for a feature:

  • The customer approves all important stories relevant to this feature.

  • There is a full, stable working version ready for release.

  • All bugs are technically resolved or left in an “acceptable” state with the approval of the PO.

  • Feature documentation is complete, including user manuals, release notes, and known issues.

Definition of Done for a Sprint

The Definition of Done for a development sprint may be determined by the following criteria:

  • The sprint goal is accomplished.

  • All user stories are completed and approved.

  • Release notes are written and documented.

  • Automated tests are written, executed, and passed.

Definition of Done for a User Story

The following criteria may determine the Definition of Done for a user story:

  • All tasks necessary to implement and test the selected story have been identified, estimated and approved by the team.

  • The code has been integrated into the main branch.

  • All bugs associated with the story have been reported and verified.

  • All coding and testing activities are complete.

  • Every story added to the sprint backlog is fully understood and approved by the team and has all the elements of a user story, including acceptance criteria, acceptance tests, etc.

Verifying that the completed activities meet these requirements ensures that you are delivering features that are actually done, not just in terms of functionality but also in terms of quality. Adhering to this concept of “done” guarantees that you do not neglect the critical activities that determine the quality of the delivery, thereby reducing debt buildup. Regardless of best practices and intentions, technical debt is frequently unavoidable. You can avoid getting in over your head as long as the team is aware of the debt, discusses it honestly, and has a strategy in place to pay it off as soon as possible.
