
October 26, 2004

Test Suite Smells

At no risk of being accused of having an original idea for a blog entry, this is yet another set of Agile-scented "smells" - inspired by a blog written by Jon on Development Environment Smells... welcome to Test Suite Smells. Believe me, I did try for a pun around suite/sweet, but couldn't crack it :-(

Spoiler Alert! A lot of what follows is motherhood statements - don't say I didn't warn you!

SMELL #1: Suite takes too long to run

"Well, DUH!", I hear you say. And you're right. Everyone knows a long running suite is a smell. However, the jury is still out on how long is too long. Personally, my limit would be around 10 minutes or so, anything longer and developers will start not running the whole suite, usually with SMELL #2-style consequences.

This smell is usually caused by an abundance of long-running tests. More often than not these are integration/system tests and may not even belong in the test suite at all. I'm not idealistic enough to think the suite should contain nothing but "pure" unit tests, but the team should aim to keep the unit test:integration test ratio fairly high (10:1? 50:1?).
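For the JUnit-inclined, one low-tech way of keeping the everyday run fast is to compose separate suites: developers run the fast one locally, the CI box runs the lot. A minimal sketch in the JUnit 3 style (the test class names here are invented purely for illustration):

    import junit.framework.Test;
    import junit.framework.TestSuite;

    // The suite developers run before every check-in. Slow
    // integration/system tests live in a separate suite that
    // only the CI environment runs.
    public class FastSuite {
        public static Test suite() {
            TestSuite suite = new TestSuite("Unit tests only - keep this well under 10 minutes");
            suite.addTestSuite(OrderCalculatorTest.class);
            suite.addTestSuite(CustomerValidatorTest.class);
            // Long-running integration tests deliberately NOT added here.
            return suite;
        }
    }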

SMELL #2: CI environment is broken more often than not

This is most often a symptom of a long-running test suite. If the suite takes a long time to run, developers will start to be selective about when/how often they run the whole thing and "roll the dice" the rest of the time by just running a handful of tests around the particular pieces of code they're currently working with. Depending on how well the tests have been developed, this approach may or may not be successful. When it's not, it's usually related to SMELL #3.

Of course, it could also just mean you have a lazy, undisciplined pack of rabid developer hounds...

SMELL #3: 1 code logic error leads to >1 test class breakage

If you're truly testing everything in one place and one place only, then a single logic error introduced into your code should only result in tests breaking in one part of your suite. If you find yourself in the position where such a breakage ripples through the test suite like the aftershocks of an earthquake (breaking tests all along the way), then your test suite contains a large amount of duplicate testing and not enough usage of mocks/stubs to remove dependencies on objects tested in other places.
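To make that concrete, here's a hand-rolled stub (PaymentGateway, OrderService and Order are invented names, not from any real codebase): the payment logic gets tested once in its own test class, and everywhere else the gateway is stubbed out, so a payment bug breaks exactly one part of the suite.

    // Stub stands in for the real gateway, which is tested
    // once, in PaymentGatewayTest, and nowhere else.
    public class StubPaymentGateway implements PaymentGateway {
        public boolean charge(String account, long amountInCents) {
            return true; // always succeeds; we're testing OrderService, not payments
        }
    }

    public class OrderServiceTest extends junit.framework.TestCase {
        public void testSuccessfulChargeMarksOrderAsPaid() {
            OrderService service = new OrderService(new StubPaymentGateway());
            Order order = service.placeOrder("acct-42", 4200);
            assertTrue(order.isPaid());
        }
    }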

SMELL #4: Suite cannot run without a network cable

Most enterprise-scale systems require connections to external applications for various purposes: payment gateways, legacy systems, authentication servers, etc. If your test suite requires actual live connections to these systems to run, then you've definitely crossed the boundary into system testing. Such testing, whilst invaluable, should be moved out of the test suite.

A suite could be run in a considerably more naked mode than just a standalone workstation. You could craft a test suite that requires nothing more than the JVM, JUnit, the codebase and the test suite itself... no J2EE container, no RDBMS, no 3rd-party libraries - mocks/stubs all around. This puritan approach would certainly lead to a suite that ran like greased lightning, but risks entering the realms of SMELL #5.
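As a sketch of what that puritan mode might look like (CustomerRepository and Customer are invented for illustration), the RDBMS can be swapped for a HashMap-backed stand-in, leaving the suite with no database, no driver and no network to worry about:

    import java.util.HashMap;
    import java.util.Map;

    // In-memory stand-in for the JDBC-backed repository; tests that
    // need persistence get this instead of a live RDBMS connection.
    public class InMemoryCustomerRepository implements CustomerRepository {
        private final Map customers = new HashMap(); // pre-generics, in keeping with the times

        public void save(Customer customer) {
            customers.put(customer.getId(), customer);
        }

        public Customer findById(String id) {
            return (Customer) customers.get(id);
        }
    }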

SMELL #5: Over reliance on mock testing

Much like those interesting folk who prefer to immerse themselves in all forms of electronic communication to the complete detriment of any human communication, if you bind your test suite completely into a cocoon of carefully hand-crafted mock objects and interface stubs, you run the risk of forgetting what reality is really like.

The underlying assumption behind using mocks/stubs to stand in for real implementations is that the stand-ins perform in a fashion largely identical to the real thing. If this is not the case, or if you don't have enough tests using the real implementation, then reality will tend to remind you of this mismatch at the most inconvenient times and in the most embarrassing of circumstances.
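A contrived example of the kind of mismatch I mean (again, invented names): suppose the real gateway throws an exception for an unknown account, but the stub just returns false. Every test passes, and the missing error-handling path stays cosily untested until production finds it for you.

    // The stub's behaviour quietly diverges from the real thing.
    public class OptimisticStubPaymentGateway implements PaymentGateway {
        public boolean charge(String account, long amountInCents) {
            // Real implementation: throws an exception for an unknown
            // account. Callers relying on this stub never exercise
            // their exception-handling code.
            return false;
        }
    }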

SMELL #6: Under reliance on mock testing

Obviously the flip side of the previous smell: if you insist on real versions of every dependency your application communicates with, then you're asking for a world of hurt in the form of a bloated, sluggish, brittle test suite.

After all, if you have one test proving that your RDBMS can return dataset A for query B, can't we assume it will do the same thing in each of the 10 other tests in various parts of your suite that present it with the same query?
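In other words, prove it against the real thing exactly once - something like the sketch below (the class names and helper are invented): one integration test talks to the live RDBMS, and the 10 other tests that need the same query use the in-memory stand-in shown earlier.

    import junit.framework.TestCase;

    // The single place the suite exercises the real JDBC-backed
    // repository against a known, locally-controlled test schema.
    public class CustomerRepositoryIntegrationTest extends TestCase {
        public void testQueryReturnsExpectedDataset() throws Exception {
            // TestDatabase.dataSource() is an invented helper pointing
            // at the test schema.
            CustomerRepository repo = new JdbcCustomerRepository(TestDatabase.dataSource());
            Customer customer = repo.findById("cust-1");
            assertEquals("Ada Lovelace", customer.getName());
        }
    }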

Obviously you walk a fine line when deciding how much reliance to place on techniques like mock testing; too much and you run the risk of missing the strength of feedback from a reality-based environment, too little and the test suite quickly becomes unmanageable. By constantly listening to the creaks and groans of the test suite and the bitches and moans of the developers, you should be able to get a feeling for where the pain points are. Once thusly informed, prioritize and triage!

Conclusion

If you move in an Agile world, you put as much effort into writing test code as you do into production-destined code. In fact, prior to deployment into whatever production environment is appropriate for your codebase, the development process will have exercised the test suite considerably more often than the codebase itself. Therefore, having the knowledge and experience to evolve a suitable test suite architecture is vital to the ongoing agility of the project team.

IMO, it is unfortunate that relatively little time has been spent concentrating on this level of software design. Reams and gigabytes have been written on production code architecture, but scant regard has been paid to the test code equivalent. As with most things in software development, good teams will come up with good solutions to these issues, but I suspect many a project team working on its first Agile project in earnest is slowly drowning under the weight of a poorly-designed test suite - a situation that could be greatly assisted by the provision of some Patterns of Enterprise Test Suite Architecture-style scribblings.

Posted by Andy Marks at October 26, 2004 11:48 AM
