As a small company trying to get established, our ability to react quickly was one of our key strengths. Speed was vital to the development team, but one of the ways we achieved that speed was by taking on technical debt.
Now that we are larger, speed matters less; quality and predictability are the main requirements. Our code base has grown and areas of inflexible design have surfaced. We need to refactor it into smaller, replaceable components so we can keep the product fresh and modern; this is another area of technical debt we need to reduce. We need to find ways of paying down that debt whilst maintaining our ability to ship real benefits to customers.
One of the main techniques we plan to use to improve our quality and reduce technical debt is Test Driven Development (TDD). We’ve been keenly watching Uncle Bob’s Clean Coders videos, and his use of TDD has been a revelation.
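As a minimal sketch of the TDD rhythm (the function and figures here are hypothetical, not from our codebase): the test is written first and fails ("red"), then just enough production code is written to make it pass ("green"), then the code is refactored with the test as a safety net.

```python
# Hypothetical illustration of the red-green-refactor cycle.
# The tests below would be written BEFORE the implementation.

def apply_discount(price, percent):
    """Return price reduced by the given percentage (illustrative example)."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

# The tests that drove the code above:
def test_ten_percent_off():
    assert apply_discount(200.0, 10) == 180.0

def test_invalid_percent_is_rejected():
    try:
        apply_discount(100.0, 150)
        assert False, "expected a ValueError"
    except ValueError:
        pass  # the invalid input was correctly refused
```

Run with a test runner such as `python -m pytest`; the point is that the failing test exists before the code it exercises.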
As a test team we haven’t made as many changes as we could have. We have started automating our testing, but too much is still manual. Our regression tests take as much as 2 weeks of our time and often find no bugs. When we’re aiming for a monthly release cycle (or shorter), that is more than we can afford. Hearing Uncle Bob talk about his work on FitNesse, one phrase stood out: “if the tests pass, we ship”. That has to be our goal as a development and testing team: a set of automated tests so comprehensive that if they pass, we ship.
A constant battle in project management, between stakeholders and the project team, is understanding when a project will be complete. Agile methodologies attempt to address this by delivering a new version regularly and letting the product owner decide when a version is complete enough to ship. However, unless the whole business is truly agile, this doesn’t really work: marketing want to announce what’s coming for the next year, and expectations are set with customers. No-one wants an estimate, or to be told to wait 2 weeks at a time to see if the system is what they want; what they really want is a commitment. So whilst we can work in 2-weekly sprints and release software to our customers monthly, we need to plan for something like quarterly commitments. To have any chance of keeping those commitments, we need some of the more traditional project management tools to manage change (and restrict it), to remove risk, and to estimate the size of tasks more accurately.
Whilst our stated aim has been a monthly release, releases have been stretching out to nearer 3 months, and that increases the pressure to delay the next one. No-one wants to take the risk that, if their sponsored feature misses the current release, it will be a further 3 months before it can ship.
One of the tools we can use to change this is making our behaviour visible. We have started to track our key metrics: by what date do we aim to have all our new code developed? How many features and fixes are still in development? How many have we already passed to the test team? With this information we can measure our progress, track our confidence in making the release, and adjust our plans accordingly. A daily stand-up with the key members of the team keeps this visible and reminds everyone of our targets. We can also display the metrics in a public place so that everyone is aware of them.
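A minimal sketch of the kind of tracking we mean; the field names and figures below are illustrative, not our real project data:

```python
from datetime import date

# Illustrative release-tracking snapshot: all names and numbers here
# are hypothetical, standing in for whatever the team actually records.
snapshot = {
    "code_complete_target": date(2014, 6, 1),  # assumed target date
    "in_development": 7,   # features and fixes still being coded
    "in_test": 12,         # items already handed to the test team
    "done": 31,            # items tested and ready to release
}

def percent_complete(s):
    """Rough progress figure: the share of items that are through testing."""
    total = s["in_development"] + s["in_test"] + s["done"]
    return round(100 * s["done"] / total)

# A single number to put on the wall and revisit at each stand-up.
print(percent_complete(snapshot))
```

Even something this crude gives the daily stand-up a shared, visible number to discuss, rather than a feeling about how the release is going.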
One of the thought experiments we can run is to find out which parts of our process would stop us releasing software on a weekly basis. At the moment that is our manual testing, but as we automate more of that we should keep our eyes on what else might be slowing us down. If we get to a point where we could release weekly, then the next step is to look at what would prevent us releasing daily.
One of the big benefits of a daily, automatically built, tested, release-ready version of our software would be for our support team. Our support developers still spend a lot of their time producing hot fixes and sending them to customers. Each fix is built, packaged and tested by hand by the developer who made the code change. Automating this process would free their time to help other customers, and reduce the occasional mistakes the manual process invites.
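The automation could be as simple as a script that runs the same build, test and package steps every time, stopping at the first failure. This is a sketch only; the `make` targets are placeholders for whatever our real build commands are:

```python
import subprocess

# Hypothetical hot-fix pipeline: the same steps in the same order every
# time, instead of each support developer doing them by hand.
# The commands are placeholders, not our actual build targets.
STEPS = [
    ["make", "build"],    # compile the fix
    ["make", "test"],     # run the automated test suite
    ["make", "package"],  # produce the artefact sent to the customer
]

def build_hotfix(steps=STEPS, runner=subprocess.run):
    """Run each step in order; stop and report at the first failure."""
    for step in steps:
        result = runner(step)
        if result.returncode != 0:
            print("hot-fix build failed at: " + " ".join(step))
            return False
    print("hot-fix built, tested and packaged")
    return True
```

The `runner` parameter exists so the pipeline itself can be tested with a fake command runner, without actually invoking the build.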
If you’ve followed all 3 of these posts on our progress, hopefully you’ll have seen how far we have come as a team: from sound principles but poor practice, to a more modern approach with a vision of where we still need to improve. It really is a credit to the team members, both past and present, who have helped drive the change whilst maintaining a high-quality product for our customers to use and our business to sell.