
Writing Your First Unit Test

12 Apr 2005
A primer on how to figure out where to start with unit testing.

Introduction

This is a brief article on getting started with writing unit tests. Rather than getting into coding and the 'how to's, of which there are numerous introductory tutorials, I want to discuss more the zen of unit testing. Hopefully you will find the information here to be of value--specifically, getting the most out of writing unit tests when you haven't ever done it before. I've written a lot of articles on unit testing, but frankly, I don't think they really provide enough guidance for the beginner.

Motivations

The first thing to identify clearly is why you want to write unit tests. There are a variety of answers.

Good Answers

  • "So my code is tested more rigorously and therefore the product ships with fewer bugs."
  • "So when I make changes to the code, I can have a bunch of automated tests to make sure that I didn't break something somewhere."
  • "It makes my code better if I write the tests first, then write the code to pass the tests."
  • "It makes testing easier. I don't have to run the application over and over simply to get to the point where a block of code is executed."
  • "When something fails, writing a unit test for that failure guarantees that it's fixed once and for all."

Bad Answers

  • "Management wants me to start using Extreme Programming."
  • "I dunno. I heard it was something everyone else does."
  • "My code is broke, I'm way over deadline, and we're all desperate."

The good answers are good because they use unit testing to be proactive during the development process and have definite goals. The bad answers are bad because there is no clear motivation or the project is already in crisis mode. In my experience, unit testing will not bail a project out of crisis.

Where Are You In The Project?

This is an important question, because where you are in the project affects the kind of unit tests you might want to first start writing.

  • Not a line of code has been written.
  • We're near the beginning--code has been written, but the architecture is still in flux (heck, even the specifications are still in flux).
  • We're really ramping up now--new programmers coming on board, lots of modules being worked on concurrently, etc.
  • We've got a lot of code written, the architecture is stable, now we're gluing it all together and starting to do stress testing.
  • The application is ready to ship (or is shipping) and we're approaching the maintenance phase and looking at what features we want in version 3001.

If you're a beginner at writing unit tests, target your unit tests to be effective based on where you are in the project:

Not A Line Of Code Has Been Written

If you're here, you should consider the "write the unit test first" approach. I personally find this hard to do, because I think in terms of classes, hierarchies, and object graphs. I don't think in terms of methods much anymore. But I used to. So let's remember how programming was done before objects:

  • We need a function that takes a, b, c as an input and does foo.
  • We need a function that takes x, y, z as an input and returns q.

Pretty much, this is all that modern methods do. They either fall into the category of "actors" that do things on the grand stage of the application, like saving data, firing events, etc., or they are "thinkers", like calculating the interest on a loan.
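In object-oriented terms, those two bullets map onto methods like these. The names and the interest formula are hypothetical, purely for illustration (the article itself shows no code):

```python
def monthly_interest(principal, annual_rate):
    """A 'thinker': takes input, returns output, touches nothing else."""
    return principal * annual_rate / 12

class AuditLog:
    """An 'actor': does things on the grand stage, like saving data."""
    def __init__(self):
        self.entries = []

    def record(self, message):
        self.entries.append(message)  # side effect: state changes
```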

Write A Validation Unit Test

The simplest thing to do, when you are writing the unit test first, is to write a test that validates that good input produces good output.

Thinkers

It's easiest to do this with "thinker" methods, because they don't do anything but process the input. You don't have to test anything else except the output.
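For example, a validation test for a hypothetical interest calculator might look like this. The function, the values, and the framework choice are all illustrative, not from the article:

```python
import unittest

def simple_interest(principal, rate, years):
    """Hypothetical 'thinker': nothing to check but the output."""
    return principal * rate * years

class SimpleInterestTest(unittest.TestCase):
    def test_good_input_gives_good_output(self):
        # Good input in, expected output out -- nothing else to verify.
        self.assertAlmostEqual(simple_interest(1000, 0.05, 2), 100.0)
```

Run it with `python -m unittest` and you have your first validation test.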

Actors

Actors are harder to work with.

The Talker

Let's say you have a method that is supposed to open a socket, or a database connection. How do you test that? Well, with actors, you have to do something a bit backwards. You instead test to see how the method handles failures. For example, if you give the OpenConnection method good parameters, you really can't test that it succeeded. But if you give it bad parameters, you can test that it fails!
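A sketch of that backwards approach, using a made-up `open_connection` that only models the parameter validation a caller can count on (a real talker would reach an actual server):

```python
import unittest

def open_connection(connection_string):
    """Hypothetical 'talker'. Only the validation step is sketched here."""
    if not connection_string or "=" not in connection_string:
        raise ValueError("malformed connection string")
    # ... the actual connect logic would go here ...
    return object()  # stand-in for a live connection handle

class OpenConnectionTest(unittest.TestCase):
    def test_bad_parameters_fail(self):
        # We can't prove a live connection succeeded from a unit test,
        # but we can prove the method fails loudly on bad parameters.
        with self.assertRaises(ValueError):
            open_connection("complete garbage")
```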

The Doer

We all like the expression "don't talk about it, do it!". Actors that actually do tangible things, like "query this table" or "save this data" or "instantiate this object" are more easily tested because you can almost always verify the results independently. Check that the member property has been initialized. Query the table or registry or XML config file yourself to make sure whether the data has been saved. Verify whether the data that you get back is what your unit test put into the database. Things like that.
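Here is a sketch of that idea using an in-memory SQLite database; the schema and the `save_customer` helper are invented for the example:

```python
import sqlite3

def save_customer(db, customer_id, name):
    """Hypothetical 'doer': persists a customer record."""
    db.execute("INSERT INTO customers VALUES (?, ?)", (customer_id, name))
    db.commit()

def test_save_customer():
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
    save_customer(db, 1, "Alice")
    # Verify independently: query the table yourself rather than
    # trusting the code that did the saving.
    row = db.execute("SELECT name FROM customers WHERE id = 1").fetchone()
    assert row == ("Alice",)
```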

Write Bad Input Tests

Garbage in, garbage out, right? Wrong! After you write your unit test that validates good operation, before you write your code, write some unit tests that give the method some bad data. Bad connection strings. Things that result in divide by zero. Strings that are longer than you allocated space for in the database field. Stuff like that. Your method should handle these conditions as gracefully as possible. Throw an exception (that's not very graceful, is it, but it's better than crashing the application). Invoke an error handler. See how unit tests can get you thinking about "what if" scenarios, and how you want to handle them?
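For instance, a bad-input test for a hypothetical `average` function (names are illustrative):

```python
def average(values):
    """Hypothetical 'thinker' with an obvious divide-by-zero hazard."""
    if not values:
        # Handle the bad input as gracefully as we can: a clear
        # exception beats crashing the application.
        raise ValueError("cannot average an empty sequence")
    return sum(values) / len(values)

def test_empty_input_is_rejected_gracefully():
    try:
        average([])
    except ValueError:
        pass  # the graceful failure we designed for
    else:
        raise AssertionError("bad input was silently accepted")
```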

We're near the beginning--code has been written, but the architecture is still in flux (heck, even the specifications are still in flux)

If you're here, I would suggest that you primarily focus on validation unit testing (see above), because as the architecture keeps changing, you want to make sure that you don't break existing code when you implement the architectural changes. If you're like me, you might not have been paying much attention to testing bad input parameters. But if you're still working with prototype code, today's bad input parameters might be tomorrow's valid inputs, so in my experience, until things have settled down a bit, you want to focus primarily on validating your core code.

Identify Stable, Core Code

What code is the most stable? The least susceptible to architecture/specification changes? This is your core code, and you can start writing the unit tests I described above. You might also consider:

Code Path Testing

Code path testing is great because you can write tests that verify that all the branches of your method are executed. Even better, you can discover branches that are dead!
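A minimal sketch: a method with several branches and one test per branch. (Running the tests under a coverage tool such as coverage.py is one way to confirm every branch fired; the function is hypothetical.)

```python
def shipping_cost(weight_kg, express):
    """Hypothetical method with several branches to cover."""
    if weight_kg <= 0:
        raise ValueError("weight must be positive")
    if express:
        return 20.0
    if weight_kg < 1:
        return 5.0
    return 10.0

def test_all_branches():
    # One assertion per code path; a branch no test can reach is
    # a candidate for dead code.
    assert shipping_cost(0.5, False) == 5.0
    assert shipping_cost(2, False) == 10.0
    assert shipping_cost(2, True) == 20.0
    try:
        shipping_cost(0, False)
    except ValueError:
        pass
    else:
        raise AssertionError("error branch never fired")
```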

Identify Unstable, Higher Level Code

It's usually the higher level methods that are constantly being written and rewritten in your prototype. Unit testing is definitely valuable for prototypes, because marketing folk love to get hold of your prototype and entice the customer with a new gadget. The last thing you want is for the demo to break in front of the customer. Plus, prototype code has a nasty habit of being folded into production code. But because it's in such a state of flux, the most effective unit test is simply one that verifies that the code does what it should do when all the inputs are correct. After all, you should be working in a constrained and well defined environment for now rather than in the wild world.

We're really ramping up now--new programmers coming on board, lots of modules being worked on concurrently etc.

This is probably the most difficult space to be in with regards to unit tests. Everyone is working on their task list and everyone is screwing up the implementation, no matter how well the design was specified (unless you practically pseudo-code the design).

Verify The Design Goals

Your unit tests should verify the design goals. Forget about validation unit tests and error handling unit tests. Forget about code path unit tests. What you want to know is, does the code that Joe wrote do what I told him to do? This isn't method-by-method validation, but rather meta-level validation. You want to create the entire test fixture for a particular scenario, do whatever calls are necessary, and verify that Joe's code does the right stuff. For example, if you have a class that is supposed to generate the SQL dynamically for database transactions, and this code is supposed to be smart, so that it will handle foreign key references, etc., then write some meta-level tests that set up different database schemas and see how Joe's code handles things. Frankly, Joe should be the one writing the lower level validation and input unit tests. You want to make sure that the design goals are being met!
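The SQL-generator scenario might be probed with a meta-level test like the sketch below. The `SqlGenerator` class is a made-up stand-in for Joe's code, not a real implementation, and only the foreign-key design goal is checked:

```python
class SqlGenerator:
    """Made-up stand-in for the dynamic SQL layer. Design goal under
    test: parents must be inserted before the children whose foreign
    keys reference them."""

    def insert_order(self, schema):
        # schema maps each table to the tables it references
        ordered, seen = [], set()

        def visit(table):
            if table in seen:
                return
            seen.add(table)
            for parent in schema.get(table, ()):
                visit(parent)
            ordered.append(table)

        for table in schema:
            visit(table)
        return ordered

def test_design_goal_foreign_keys_respected():
    # Meta-level check: set up a schema and verify the *goal*, not
    # any particular method's return value.
    schema = {"orders": ["customers"], "order_items": ["orders"], "customers": []}
    order = SqlGenerator().insert_order(schema)
    assert order.index("customers") < order.index("orders") < order.index("order_items")
```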

We've got a lot of code written, the architecture is stable, now we're gluing it all together and starting to do stress testing

Unit tests are great for stress testing because you can create stressful situations (hopefully preventing customer-inspired stressful situations). If you're at this phase and are just introducing unit testing, then start with something that speaks to where you are in the project--write unit tests that set up stress testing, monitor memory and resource utilization, and measure performance. But before you do all this, you will need to come up with some sort of pass/fail criteria. How much memory should the application be using when I calculate pi to the 10,000th decimal place? How slow can I tolerate the database response time if there are 100 million records to sort? That sort of thing.
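The pattern looks something like this sketch: decide the budget first, then assert against it. The operation, the record count, and the two-second budget are all illustrative numbers:

```python
import time

def sort_records(records):
    """Stand-in for whatever operation is being stress tested."""
    return sorted(records)

def test_sort_meets_performance_budget():
    # Pass/fail criteria decided up front: 100,000 records must sort
    # in under two seconds (numbers chosen for illustration).
    records = list(range(100_000, 0, -1))
    start = time.perf_counter()
    result = sort_records(records)
    elapsed = time.perf_counter() - start
    assert result[0] == 1, "sort produced wrong output under load"
    assert elapsed < 2.0, "exceeded the performance budget"
```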

The application is ready to ship (or is shipping) and we're approaching the maintenance phase and looking at what features we want in version 3001.

At this point, you should be able to give yourself a breather. Go through all the code, write validation tests, then bad input tests, then code path tests, then stress tests. If you're at this point, it should be pretty obvious whether the application meets the design goals or not, so you can pretty much ignore those tests. Do them when you have spec'd out the features in the next version. Actually, in the version after next, because the next version will be a service pack to fix all the bugs you find when you write all those unit tests!

What Does The Unit Test Require?

Most unit tests require some (or a lot of) setup. Ugh. This is the worst part about unit testing. You'll need repeatable data. You'll need files (or the lack of them) with specific content (or lack thereof). You may need connections to other machines, servers, databases, etc. Unit tests are not easy, mainly because you often end up writing another whole application (or so it seems) simply to do all the setup stuff!

Test The Simple Methods First

Instead of getting involved in complicated unit testing, go slowly. Identify methods for which you can write unit tests that don't require any setup. Then move on to unit tests that require only a little bit of setup. And so forth.

Consider Sequencing Tests

Sometimes it's easier to write a series of unit tests, where the first unit test does some setup for the next unit test, and so on. Sequencing the unit tests significantly reduces the amount of work you need to do for each individual unit test. The first unit test to fail in the sequence stops the entire sequence. Unfortunately, the only test engine I know of that does sequence testing is my own (AUT).
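AUT's sequencing isn't reproduced here, but the idea can be sketched in plain Python: each step leaves state behind for the next, and the first exception aborts the rest of the sequence (all names are hypothetical):

```python
class OrderSequence:
    """Sequenced unit tests sharing one fixture: each step builds on
    the state the previous step left behind."""

    def __init__(self):
        self.state = {}

    def step_1_create_order(self):
        self.state["order"] = {"id": 42, "items": []}

    def step_2_add_item(self):
        self.state["order"]["items"].append("widget")

    def step_3_check_total(self):
        assert len(self.state["order"]["items"]) == 1

    def run(self):
        # The first failing step raises, stopping the entire sequence.
        for step in (self.step_1_create_order,
                     self.step_2_add_item,
                     self.step_3_check_total):
            step()
```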

Organize Your Setup Code

Your setup code should be well organized. Don't copy and paste into each unit test--write static methods in a separate assembly so that it's easy for every unit test assembly to access the setup functions. In other words, treat your unit tests with the same respect and diligence as you would your product code.
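In Python terms (the article's .NET "separate assembly" becomes a shared module; the schema is invented), that might look like:

```python
# test_fixtures.py -- shared setup helpers, written once, imported everywhere
import sqlite3

def make_test_db():
    """Builds the canonical test database every unit test starts from."""
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
    db.execute("INSERT INTO customers VALUES (1, 'Alice')")
    db.commit()
    return db
```

Every test module imports `make_test_db` instead of pasting the schema around.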

Writing Your Second Unit Test

Most likely, your first unit test, unless it's really trivial, is going to require calling some other code in the application to do some setup stuff. Have unit tests been written for that code? This is a very easy way of figuring out what to do next.

Best Practices

Here are some things you should get into the habit of doing so that unit tests are cost effective:

  • When a unit test fails, fix the code right away.
  • If all the unit tests pass but the code still crashed, write a unit test that demonstrates the failure.
  • Maintain your unit tests! Keep them up to date with your architecture/specification changes.
  • Write your code so that it's easy to write unit tests.
  • Refactor your code when necessary to make it more unit test friendly. This will almost always lead to better code as well.
  • Do not copy and paste from your application code that is needed to do setup!
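One common refactoring that makes code more unit-test friendly is injecting dependencies instead of reaching for them. A tiny sketch (function and messages are hypothetical):

```python
import datetime

def greeting(now=None):
    """Refactored for testability: the clock is a parameter, so a test
    can pin the time instead of depending on when the test runs."""
    now = now or datetime.datetime.now()
    return "Good morning" if now.hour < 12 else "Good afternoon"
```

The untestable version would call `datetime.datetime.now()` unconditionally; with the parameter, a unit test controls the input completely.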

Some Tools

There are several unit test tools out there, and several products that, when you build your application, will automatically run your unit tests. While searching for various links, I came across a very comprehensive web page.

Of course, there are two glaring omissions! AUT and MBUnit.

Further Reading

Besides my own articles on Advanced Unit Testing, take a look at a few interesting things I found:

License

This article has no explicit license attached to it but may contain usage terms in the article text or the download files themselves. If in doubt please contact the author via the discussion board below.



About the Author

Marc Clifton

United States
Marc is the creator of two open source projects, MyXaml, a declarative (XML) instantiation engine, and the Advanced Unit Testing framework, as well as Interacx, a commercial n-tier RAD application suite. Visit his website, www.marcclifton.com, where you will find many of his articles and his blog.
 
Marc lives in Philmont, NY.

Article Copyright 2005 by Marc Clifton