Developer Blog Banter - How do you test your applications?

David started a nice article series: The developer banter - this time about testing

"How do you organise your tests? Do you separate your unit tests, integration tests and UI tests into separate projects?
Do you do anything specific to keep track of your tests?
What naming conventions do you use? Do you run them before a check-in, or is that what the build server is for?"

Interesting question @DavidBurela


My way of testing changes all the time; currently I am doing it as described below.

Goal of my tests

As soon as a change causes unexpected application behavior, we get an alert --> an email.

I use TFS (MSBuild, MSTest) as my CI server, with builds on each check-in.
I try to write my tests as expectations of how the application should work.
I think everyone agrees on automating tests these days…

Organization

I organize my tests similarly to Liam, by splitting them into slow and fast, but I use deliberately broader names:

  • Unit tests (fast) and
  • Integration tests (slow and not independent)

Naming of projects

I name my test projects like this:

  • Tests.Unit.SalaryService
  • Tests.Integration.SalaryService

Figure: Sample solution structure with a WCF service, an MVC web app, Silverlight clients, some Common projects, and even a Silverlight unit test project
In this blog post I describe how we test our WCF service.

Naming of test methods

For naming my test methods I use two different approaches:

  1. SUT_Scenario_ExpectedResult    (as per Roy Osherove's naming standards for unit tests)
  2. SUT_Context_FeatureName

Examples for those tests

1. SUT_Scenario_ExpectedResult

        [TestMethod]
        public void UserConnectionHelper_StoreNewUserConnection_ReturnsNewObject()
        {
            var userConnectionHelper = new UserConnectionHelper();

            var result = userConnectionHelper.StoreNewUserConnection(
                IrrelevantString, IrrelevantString, IrrelevantLong,
                IrrelevantString, IrrelevantBool, IrrelevantBool,
                IrrelevantString, IrrelevantBool, IrrelevantBool);

            Assert.IsNotNull(result);
        }

 

2. SUT_Context_FeatureName

        [TestMethod]
        [UseForDocumentation]
        public void SequentialAuction_LimitBids__OnlyBidFullIncrements()
        {
            FakeLot1.ActualHighestBidder = Bob;
            FakeLot1.ActualPrice = 100;
            FakeLot1.BidIncrement = 50;

            Anna.enters_LimitBid(Lot1, 110);

            ProcessBids();

            Expect_HoldingBidder_AndPrice(Bob, Lot1, 100);
        }

Note: I am not using a BDD framework (yet), but the idea is that our business people can read these tests easily and we can discuss the different scenarios.

I have a [UseForDocumentation] attribute that I use to automatically generate HTML documentation from these tests with T4 templates. Again, I should probably use a BDD framework and let it do that for me.
When I generate that documentation, I basically strip out all the C# noise (".", ";", etc.) and apply some nice CSS formatting.
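The noise-stripping idea can be sketched roughly like this (the `TestNameFormatter` class and its splitting rules are my illustration here, not the actual T4 template):

```csharp
using System;
using System.Text.RegularExpressions;

public static class TestNameFormatter
{
    // Turns a test method name like
    // "SequentialAuction_LimitBids__OnlyBidFullIncrements"
    // into a readable sentence like
    // "Sequential Auction: Limit Bids: Only Bid Full Increments"
    public static string ToReadableSentence(string testMethodName)
    {
        // Underscores separate SUT / Context / FeatureName;
        // RemoveEmptyEntries also swallows double underscores.
        var parts = testMethodName.Split(
            new[] { '_' }, StringSplitOptions.RemoveEmptyEntries);

        for (var i = 0; i < parts.Length; i++)
        {
            // Insert a space before each inner capital: "LimitBids" -> "Limit Bids"
            parts[i] = Regex.Replace(parts[i], "(?<!^)([A-Z])", " $1");
        }

        return string.Join(": ", parts);
    }
}
```

From there, the generated HTML is just these sentences grouped per test class, with the CSS doing the rest.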

This is currently a work in progress, and it depends highly on the other devs and our product owner.
If it works, we keep it ;-)

Mocking?

My current favorite mocking framework is Moq, over Rhino Mocks. I haven't tried NSubstitute yet (the syntax looks very nice).
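As a small sketch of the Moq syntax I like (the `ISalaryService` interface and the values below are made up for illustration, not from our codebase):

```csharp
using Microsoft.VisualStudio.TestTools.UnitTesting;
using Moq;

// Hypothetical dependency, just for illustration
public interface ISalaryService
{
    decimal GetMonthlySalary(int employeeId);
}

[TestClass]
public class MoqSketchTests
{
    [TestMethod]
    public void SalaryService_KnownEmployee_ReturnsStubbedSalary()
    {
        // Arrange: stub the dependency with Moq
        var serviceMock = new Mock<ISalaryService>();
        serviceMock.Setup(s => s.GetMonthlySalary(42)).Returns(5000m);

        // Act
        var salary = serviceMock.Object.GetMonthlySalary(42);

        // Assert: the stubbed value came back, and the call happened exactly once
        Assert.AreEqual(5000m, salary);
        serviceMock.Verify(s => s.GetMonthlySalary(42), Times.Once());
    }
}
```

The lambda-based Setup/Verify pairs read much nicer to me than record/replay-style mocking.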


How do we write tests?

Some test-first, some test-after, some TDD style, depending on who is writing and what it is.

Rule of thumb

  • We now introduce new features only if there is enough test coverage around that area.
  • We approach new core business functionality TDD style.

 

Specials about our testing

#1 Testing Silverlight

We have one big problem though: automated testing of Silverlight clients.
We have some tests written with the Silverlight unit test framework, but they do not run on the build server, which makes them almost useless.

 

#2 Stress test overnight

In the current project we focus a lot on the performance of the system (Silverlight clients talking to a WCF service). So what we do is:
Every night we run a stress test.
I start up perfmon with some interesting counters and log them to a database every 15 seconds. Then we start up 500 console clients plus some Silverlight clients and hammer the application hard for two hours.
The next morning we have a nice PDF report (SSRS) in our inbox that tells us how the stress test went (min, max, avg CPU, IO, network, memory, connections, …)
Blog post coming up

#3 UI tests
No web UI tests in this project (yet), because our MVC views don't contain any logic. But we do test the controllers.
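A controller test in this style could look roughly like the sketch below (`HomeController` and its `Index` action are hypothetical, not from the actual project):

```csharp
using System.Web.Mvc;
using Microsoft.VisualStudio.TestTools.UnitTesting;

// Hypothetical controller, just for illustration
public class HomeController : Controller
{
    public ActionResult Index()
    {
        ViewBag.Title = "Dashboard";
        return View();
    }
}

[TestClass]
public class HomeControllerTests
{
    [TestMethod]
    public void HomeController_Index_ReturnsViewWithTitle()
    {
        var controller = new HomeController();

        // Controllers can be exercised directly, no web server needed
        var result = controller.Index() as ViewResult;

        Assert.IsNotNull(result);
        Assert.AreEqual("Dashboard", (string)result.ViewBag.Title);
    }
}
```

Because the views stay logic-free, testing the controllers this way covers the interesting behavior without any UI automation.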

 

All this is subject to change in the coming months, but it is good to have in a blog post.
Let's see how we do it in a year!
