By Chamat Arambewela · Posted on Oct 14, 2021

For the final segment of this series, I’m going to talk about the first ten sprints (20 weeks) of our MVP build, the evolution of our release plan and some hard-learned lessons.

Our biggest challenge was around release cadences and getting test engineers out of the critical path of a release. We started by releasing into Sandbox every four weeks, and our test team invariably ended up overloaded on the ‘Friday before the release’. The weekend became critical for bug fixes, and release Monday was pretty much when we finished full testing… barely. 😱 The whole thing started smelling suspiciously like a waterfall development and delivery cycle! On top of that, our client needed features released faster, adding pressure for quicker feature releases and quicker regression testing. Streamlining internal releases and unblocking testing became critical to achieving real agility.

We promoted a ‘fail fast’ culture within the sprint teams, creating an environment where Dev released stories daily for testing. We assigned clear testing responsibilities – Dev focused only on unit testing, while test engineering focused only on story testing, not regression. Issues found were fixed immediately over a quick conversation on Slack. The ethos here is to find and fix issues faster. We retained the cycle of internal releases for regression testing twice a week, but deployed that code into a separate AWS account.

This paid off immediately, uncovering environment-specific niggles that would otherwise only have surfaced during Sandbox release regression. We also took a hard stance and resisted the temptation to release stories in the last two days before a release – difficult when chasing aggressive timelines, but crucial to preserve the sanity of our test engineering team!

Automated testing became mandatory – this goes without saying: no enterprise platform can be successfully regression tested without it. The differentiator here is the ‘type’ of tester involved. QA analysts write test cases and manually test a system. Test engineers ‘write code to test code’ – writing test scripts, tooling to launch those scripts and tooling to evaluate the outcomes.
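
To make ‘code to test code’ concrete, here’s a minimal sketch of the kind of automated regression check a test engineer might script – the `/orders` endpoint, `TEST_ENV_URL` variable and payload are hypothetical stand-ins, not our platform’s actual API:

```python
# regression_smoke_test.py -- a minimal automated check a test engineer
# might write, versus a QA analyst clicking through the same flow.
# The /orders endpoint and TEST_ENV_URL are hypothetical placeholders.
import os

import requests

BASE_URL = os.environ.get("TEST_ENV_URL", "https://regression.example.com")


def test_create_order_round_trip():
    # Create a record via the API...
    created = requests.post(
        f"{BASE_URL}/orders",
        json={"sku": "ABC-123", "quantity": 2},
        timeout=10,
    )
    assert created.status_code == 201

    # ...then read it back and assert the system stored what we sent.
    order_id = created.json()["id"]
    fetched = requests.get(f"{BASE_URL}/orders/{order_id}", timeout=10)
    assert fetched.status_code == 200
    assert fetched.json()["quantity"] == 2
```

A suite of checks like this, run with pytest against every internal release, is what lets regression happen without a human in the loop.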

Taking the middleman out became crucial – daily releases and twice-weekly deployments meant that having someone manually release to test environments was a bottleneck. We put automated ‘fetch and deploy’ pipelines in place, with a nifty Slack bot informing the team every morning about the status of the deployment. 😀 The levels of integration allowed by today’s tools are truly inspiring!
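
As an illustration of the glue involved, here’s a hedged sketch of what a morning status notifier can look like – the status file and its fields are hypothetical assumptions, though the `{"text": ...}` payload is exactly what a Slack incoming webhook accepts:

```python
# deploy_notifier.py -- hypothetical sketch of a morning Slack update.
# Assumes the pipeline writes its last result to a small JSON status
# file, and that SLACK_WEBHOOK_URL points at a Slack incoming webhook.
import json
import os

import requests

STATUS_FILE = "/var/deploy/last_status.json"


def notify_team() -> None:
    with open(STATUS_FILE) as fh:
        # e.g. {"env": "regression", "ok": true, "build": "1.4.2"}
        status = json.load(fh)

    emoji = ":white_check_mark:" if status["ok"] else ":x:"
    text = (
        f"{emoji} Overnight deploy of build {status['build']} "
        f"to {status['env']}: {'succeeded' if status['ok'] else 'FAILED'}"
    )
    # Slack incoming webhooks accept a simple {"text": ...} JSON body.
    requests.post(os.environ["SLACK_WEBHOOK_URL"], json={"text": text}, timeout=10)


if __name__ == "__main__":
    notify_team()
```

Wired to a scheduled job at the end of the pipeline, something this small is all it takes for the team to wake up to a status update.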

We keep a constant feedback loop with the entire team – listening, tweaking and trialling. Finding the right balance for the team has to happen organically, from the inside out. We’re still a long way off from where we want to be, but only five months into the build cycle we’ve made some impressive progress, with 2-week release cycles and a firm focus on tightening them further!
