27–29 September 2017, Churchill College, Cambridge

So we're going "No QAs". How do we get the devs to do enough testing?

A 45 minute Case Study by:

Steve Wells


About this Case Study

This session follows on from a talk I gave last year arguing that dedicated QAs in a scrum/agile team create a mini-waterfall, reducing speed and quality. Afterwards, I was mobbed by people asking the same question - "If we say 'No QAs', how do we get the devs to do enough testing?".

The session is based on an expanded version of a blog post I wrote on this topic.

Full team responsibility

The real key to successfully having a 'devs-only' team is to make sure that everyone on the team feels jointly responsible for the quality of what they are delivering. This is the main reason, in my opinion, for not having dedicated QA roles in the first place; having dedicated QAs allows developers to abdicate the responsibility for quality to someone else - "Yes, there was a major production bug, but QA should have picked that up". The problem doesn't go away by introducing 'devs-in-test'; just because someone is automating the tests, it doesn't mean the 'coding-only' devs have stopped shirking responsibility.

I often hear the argument against this, which is "but my devs refuse to test, they just want to write code". Well, if they don't want to test, they are not developers - they are code monkeys. The same devs often say "I'm a developer (i.e. a coder) - I don't do meetings". In this day and age, developers need to have a much broader view of their work than this; there's more to developing than typing code. How can they know they are building the right thing unless they are deeply involved in agreeing requirements with the PO? How can they be sure they've built it right unless they have fully tested it? Presumably these same developers are the ones constantly on their phones during planning sessions, and the ones who think standups and retros are a waste of time. They need to change their attitude and become team players with joint responsibility for everything the team does. Start with the recruitment process - as Simon Sinek says, "Don't recruit for skills, recruit for attitude. You can always teach skills". Make sure you employ good all-round team players with great attitudes, not code monkeys.

The real benefit of having the team jointly responsible is that product quality actually improves - which probably sounds counter-intuitive when you are getting rid of QA, but it is something I have seen in every team that has taken this approach. Speed of delivery also increases, as the process can be tailored more closely to how the team wants to deliver. Having a separate QA role means having a QA phase, which, let's face it, is an agile anti-pattern; it's a waterfall!

Seeing a truly responsible team in action is great; any issues become a 'team fail'; no one gets blamed, problems just get fixed. Developers actually want to deliver high quality stuff, and empowering them by giving them full responsibility for that results in highly efficient, motivated teams that produce high quality product. Spoon feeding them the requirements ("Build this. Like this") and allowing them to shuffle responsibility for testing onto someone else just leads to poor quality and low morale.

Interestingly, the original version of the Scrum Guide contained a section about pigs and chickens that was subsequently removed. It was all about involvement versus commitment: chickens are involved in a breakfast (they lay the eggs), but pigs are rather more committed (they supply the bacon!). A truly cross-functional, responsible team is committed; a role that is only partially responsible for delivery - be it only coding, or only testing - is merely involved...


Automate as much of the testing as possible

If you only have devs, they will do this; there is nothing they like less than doing something manually when they could automate it - automation, after all, is still 'coding' to them (and if they don't see it like that, convince them...).

Without QAs, the devs have a vested interest in speeding up testing by automation (which pure testers don't; why would they put themselves out of a job?) - they can get quicker feedback, and they don't have to do the same task more than once. The benefits of automation are obviously enormous, and I won't go into them here, but suffice it to say, it is more likely to happen without dedicated testers than with. One reason is that, with dedicated QAs, devs may not want to step on their toes by suggesting automated ways of testing ("who's the testing expert round here anyway?").

Play the 'shiny new stuff' card

Developers typically like to be up to date with the latest shiny stuff, and testing tools and techniques are no exception.

So, play the shiny new stuff card. Use the latest testing frameworks. Make unit tests run automatically on check in, make functional and e2e tests run when merging feature branches into develop, and so on. Decide that all your tests have to run in less than 10 minutes.
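As a sketch of the kind of gate this enables - assuming a Python toolchain; the wrapper, the budget figure, and the pytest invocation below are illustrative, not something from the talk - the 10-minute rule can be enforced mechanically so the build fails when the suite overruns:

```python
import subprocess
import sys
import time

BUDGET_SECONDS = 10 * 60  # the team's agreed ceiling for the whole suite


def run_suite_with_budget(cmd, budget=BUDGET_SECONDS):
    """Run the test command; fail the build if it fails or overruns the budget."""
    start = time.monotonic()
    result = subprocess.run(cmd)
    elapsed = time.monotonic() - start
    if result.returncode != 0:
        raise SystemExit(f"tests failed (exit code {result.returncode})")
    if elapsed > budget:
        raise SystemExit(f"suite took {elapsed:.0f}s; budget is {budget}s")
    return elapsed
```

At the merge gate this would be called with the real suite, e.g. `run_suite_with_budget([sys.executable, "-m", "pytest", "-q"])`; the point is that the time budget is a build-breaking rule, not an aspiration.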

Suddenly testing isn't a chore anymore - it's a technical problem to solve, and before you know it there are dashboards everywhere, klaxons going off on failed builds, red and green build lights, stats on test timings and failure rates per module, and so on. New CI tools come out of the woodwork faster than new JavaScript frameworks, and testing becomes a fundamental aspect of the work. How satisfying when conversations in planning are as much about how a story can be tested as about how to build it.

You can also play the agile card - devs like agile because it puts them centre stage as empowered individuals in charge of their own destiny. Writing tests and testing as they go along lets them fail as early as possible, which drives down the cost of each change and speeds up deployment. Rapid deployment and data-driven development mean rapid feedback that finds its way onto a dashboard somewhere, and suddenly KPIs are centre stage, reinforcing the collective responsibility for quality.

Do the maths

It's a bit simplistic, but I used this with my team when we had to replace a manual tester who was leaving. First, we looked at the workload and realised that there probably wasn't enough work to keep a full-time tester testing all of the time. So, if we hired a developer instead and shared the manual testing, we would end up with more development resource for a little manual-testing 'pain'. As an example, in a team of 8, we get 7/8 of a person of extra resource, and everybody has to do 1/8 of the testing.

The team agreed that was a fair trade. Of course, more developers means more automation, so less testing, so the maths is even more appealing. This, in fact, is what has happened; suddenly, there just isn't as much dedicated testing, but the number of bugs has reduced.
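The arithmetic above can be made explicit. This is a sketch: the testing-workload figure is an assumption chosen to match the article's "7/8 of a person" example, not measured data:

```python
TEAM_SIZE = 8            # headcount after replacing the tester with a developer
TESTING_WORKLOAD = 1 / 8  # manual-testing demand in full-time-equivalents
                          # (assumed: far less than a full-time job)

# Before: 7 developers plus 1 dedicated tester.
dev_capacity_before = TEAM_SIZE - 1

# After: 8 developers who absorb the testing between them.
dev_capacity_after = TEAM_SIZE - TESTING_WORKLOAD

extra_capacity = dev_capacity_after - dev_capacity_before  # 7/8 of a person
per_person_test_share = TESTING_WORKLOAD / TEAM_SIZE       # each dev's slice
```

The gain is simply one person minus the real testing workload - so the smaller the genuine testing demand, the better the trade looks, and automation shrinks it further.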

Also, if you think about it, having a dedicated QA is inefficient, as they cannot be busy all the time. As an example, let's say we have a team of 4 with one QA. Is the workload of the team as a whole always 75% dev work and 25% testing work, every hour of every day? Clearly not. Hence, at most times during the day, the tester is either a bottleneck (there is more than 25% test work) or under-used (there is less than 25% test work). If everyone both devs and tests, this inefficiency disappears.
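A toy illustration of that mismatch - the daily demand figures are hypothetical, assuming a 4-person team where one fixed role supplies exactly 25% test capacity:

```python
# Hypothetical share of the team's total work that is testing, day by day.
daily_test_demand = [0.10, 0.40, 0.20, 0.35, 0.25]
FIXED_TEST_CAPACITY = 0.25  # one dedicated QA in a team of four

# Days where demand is below capacity leave the tester idle;
# days where it is above capacity queue work behind the tester.
idle = sum(max(FIXED_TEST_CAPACITY - d, 0) for d in daily_test_demand)
bottleneck = sum(max(d - FIXED_TEST_CAPACITY, 0) for d in daily_test_demand)
```

With a fixed tester, any mismatch shows up as one of those two terms; if everyone can test, capacity tracks demand and both shrink towards zero.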

Make it competitive

Having the devs review each other's code can get quite competitive as they try and find bugs. Encourage this! Obviously make sure it doesn't get personal or violent or destroy morale, but a bit of healthy competition goes a long way to finding all the issues in a bit of code.

This does need managing carefully, but is a great way to up the quality. It's the same reason why pair programming and code reviews result in better code; nobody likes someone else pointing out their errors.

The 'code monkeys' mentioned above will, of course, probably be resistant to this as well - how dare someone else criticise their code?...


Cross-skilling

I don't actually believe that coders are genetically unable to test, or that testers are genetically unable to code. So, get them to teach each other their skill sets. A deeper understanding of code can only help testers test, and an appreciation of testing strategies can only help coders write tests that cover more scenarios earlier.

Don't reinforce the role separation - send coders to testing meetups and conferences, get testers involved in development communities, and so on.

And, don't forget, this cross-skilling is, in fact, just 'learning', which is a key part of any team. How many of your front end team were React.js experts 3 years ago? None of them - it didn't exist; they had to learn that skill to do their job. So why not Selenium or exploratory testing strategies?

Things to avoid

  1. Rotating the QA role. This does not remove the problem; just bake testing and quality into the normal work of the team - there should not be separate 'dev' and 'test' phases.
  2. Allowing devs to refuse to test. Or indeed to refuse to do anything. Team players will do what needs to be done, whatever it is. If they refuse, they need coaching to acquire that mindset. If they still refuse, there are others out there who won't, so replace them. Always recruit with this in mind.
  3. 'Dev-in-test'. I'm not a fan of this role. If someone can develop, they can write tests. If a tester can write code, they can develop, surely? Yes, they may have specialist testing knowledge, but they must impart that to the team so everyone knows how to code and test; I cannot accept there is a specific testing gene that means certain people are uniquely able to test better than anyone else. Employ 'T-shaped people' - they may have deep specialisation in one area, but have a broad range of skills.
  4. 'Dev-complete'. Avoid this phrase at all costs. It merely means "I believe I have finished coding until QA find some issues and it comes back to me for fixing (which I fully expect to happen)". Complete is a binary concept: if something has been shipped, it is complete; if not, it isn't done yet.

About the Speaker

Steve has been an agile coach and scrum master for many years, active in the field and an active speaker at conferences and meetups. Having been instrumental in the digital transformation at Sky, Steve now works on M&S.com, the digital arm of one of the country's oldest and most famous retail brands.



Agile Cambridge was organised by Software Acumen, based in Cambridge, England. We build communities to help technology professionals grow.
