I really enjoyed Martin Fowler’s recent article on Test Coverage, and in particular I liked the graphic he included in it:
[Missing image: Fowler's diagram of what test coverage does and doesn't measure]

This simple diagram sets out very clearly what test coverage is good at measuring and what it is not. That's an important distinction to make, because coverage is no indication of quality; it only tells you what is and isn't exercised by tests.

This appears to be at odds with a current wave of thinking (well, one that I've observed anyway) in which a message comes down from senior managers that all projects must have at least xx% test coverage. The repercussions of missing the target vary, from audit actions to remedy the coverage all the way through to blocking a project's release until it hits the magic number. The number itself varies wildly too, but the thinking behind it is common, and flawed: High Test Coverage == High Quality Code.

Correlation != Causation

Correlation does not imply causation, but once a correlation has been pointed out it can be difficult to dismiss without a solid understanding of what you are observing and why the correlation exists.

The correlation is real: plenty of projects with high test coverage are of excellent quality. These projects tend to be driven by the sensible implementation of good tests as and when required, quite likely following a deliberate practice such as Test Driven Development, but not necessarily. The thing to note here is not the quantity or coverage of the tests; it's the use of good tests, when and where they are required.

The other side of this is the team that has not been writing tests all along and is then handed a target. They quite often panic and immediately look for shortcuts. Part of the reason is that testing is an often-overlooked skill and, in my experience, a large number of developers mistakenly think it's beneath them. That's not always the case, but this is a blog post, so I feel justified in the over-generalisation.

I'm not saying it's easy to retro-fit good tests to a largely untested code base. Far from it: without testing being considered up front, the code is unlikely to be organised or designed in a way that makes testing easy. I've seen many teams jump on a product that promises to write all the tests they need, automatically, with minimal input. The problem with these tests is that they are rarely the high-value tests that are needed, and they are applied mechanically across the project with no sense of context. More importantly, they deprive developers of the knowledge and understanding they would gain by thinking through which tests are relevant and how the code should behave.

So, what should we do?

First of all, we should stop selling code coverage as a measure of quality. It's not. It can, however, be useful for spotting projects where test coverage is particularly low, or for tracking trends in an evolving code base.

When coupled with other tools that measure complexity and rate of change (of the source code over time), it can help you identify good areas to examine and see if you could write more tests. I imagine there are even more interesting and useful insights to be had by combining other measures; feel free to comment on this if you have any good ones.

Aiming for 100% code coverage is a noble goal, but it should only be an aspiration and definitely not a target…and you still shouldn’t confuse it with a stamp of quality assurance.

Personally, I struggle to do anything other than TDD these days but I understand it’s not the path of least resistance and can be difficult to learn. Having said that, if you are not writing tests for your software then you are either incredibly over-confident, incredibly naive, or both.

End with an analogy

A road network

Like any analogy, I expect this one will have many flaws but I’m going to use it anyway.

  • Would you consider that the road network would be improved just by adding more roads until we reached saturation?
  • Regardless of the quality of the roads?
  • Even when they were built where nobody lived, traveled, or wanted to go?