Visual Studio


Acceptance Tests for User Stories allow the Product Owner to more easily say whether or not they accept a Story as ‘Done’. Also, Acceptance Tests can be used in Behaviour Driven Development (BDD) to provide an “outside in” development process that complements the “inside out” coding style of Test Driven Development (TDD).

SpecFlow brings Cucumber BDD to .NET without the need for a Ruby intermediary like IronRuby.

In your Tests project add a Features folder. SpecFlow installs some templates into VS.NET, so add a new SpecFlowFeature and fill it out like the following example, InterestRatings.feature:

Feature: Interest Ratings
	In order to manage the Interest Ratings
	As a Trader
	I want an Interest Ratings screen

Scenario: View the Interest Ratings
	Given a repository of Interest Rating records
	When I view the Interest Rating screen 
	Then the full list is created

Scenario: Create an Interest Ratings record
	Given a repository of Interest Rating records
	When I create an Interest Rating with name Test 
	And Interest Rating code 4
	Then the Interest Rating is saved to the repository with name Test and code 4

The scenarios are the tests. The format is a clear Given-When-Then description. As you create the scenario, the .feature.cs code-behind will be updated for you.

Now you need to link up the statements in your scenario to steps that the unit test framework can execute. Create a Steps folder under Features and add a SpecFlowStepDefinition. You’ll find the generated file has some useful placeholders to get you started. Here, for example, is InterestSteps.cs:

    [Binding]
    public class InterestSteps
    {
        private IInterestRatingService interestService;
        private InterestRatingViewModel interestRatingViewModel;
        private InterestRating rating;
        private Mock<IValidationService> validationService 
                      = new Mock<IValidationService>();
        private Mock<ILoadScreen> loadScreen 
                      = new Mock<ILoadScreen>();
        private Mock<IServiceLocator> serviceLocator 
                      = new Mock<IServiceLocator>();
        private Mock<IRepository<InterestRating>> interestRepository 
                      = new Mock<IRepository<InterestRating>>();
        private List<InterestRating> interestRatings;

        [Given("a repository of Interest Rating records")]
        public void GivenARepositoryOfInterestRatingRecords()
        {
            Mock<IValidator> validator = new Mock<IValidator>();
            this.serviceLocator
                     .Setup(s => s.GetInstance<IValidationService>())
                     .Returns(this.validationService.Object);
            this.validationService
                     .Setup(v => v.GetValidator(It.IsAny<Type>()))
                     .Returns(validator.Object);
            this.serviceLocator
                     .Setup(s => s.GetInstance<IEventAggregator>())
                     .Returns(new EventAggregator());
            this.serviceLocator
                     .Setup(s => s.GetInstance<ILoadScreen>())
                     .Returns(this.loadScreen.Object);
            ServiceLocator.SetLocatorProvider(() => this.serviceLocator.Object);
            this.interestRatings = new List<InterestRating>();
            this.interestRepository
                     .Setup(s => s.GetAll())
                     .Returns(this.interestRatings);
            this.interestService 
                  = new InterestRatingService(this.interestRepository.Object);
        }

        [When("I view the Interest Rating screen")]
        public void WhenIViewTheInterestRatingScreen()
        {
            this.interestRatingViewModel 
                  = new InterestRatingViewModel(this.interestService);
        }

        [When("I create an Interest Rating with name (.*)")]
        public void WhenICreateAnInterestRatingWithName(string name)
        {
            this.interestRatingViewModel 
                  = new InterestRatingViewModel(this.interestService);
            this.rating 
                  = this.interestRatingViewModel.InterestRatings[
                        this.interestRatingViewModel.InterestRatings.Count - 1];
            this.rating.InterestRatingName = name;
        }

        [When("Interest Rating code (.*)")]
        public void AndInterestRatingCode(int code)
        {
            this.rating.InterestRatingCode = code;
        }

        [Then("the full list is created")]
        public void ThenTheFullListIsCreated()
        {
            Assert.IsTrue(this.interestRatings.Count 
                      == this.interestRatingViewModel.InterestRatings.Count);
        }

        [Then("the Interest Rating is saved to the repository with name (.*) and code (.*)")]
        public void ThenTheInterestRatingIsSavedToTheRepository(
                           string name, int code)
        {
            InterestRating rating 
                = (from m in this.interestRatingViewModel.InterestRatings
                   where m.InterestRatingName.Equals(name)
                   select m).Single();

            Assert.IsTrue(rating.InterestRatingName.Equals(name),
                "The interest rating name was not saved.");

            Assert.IsTrue(
                rating.InterestRatingCode == code,
                "The interest rating code was not saved.");
        }
    }

In particular, notice the reuse of steps, for example GivenARepositoryOfInterestRatingRecords(), and the use of variable placeholders like (.*) to allow the passing of variables into the tests.
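The `(.*)` placeholders are regular-expression capture groups: the runner matches each scenario line against the step attributes and passes the captured text in as (converted) method arguments. A minimal, language-neutral sketch of that binding mechanism (a hypothetical step table, not SpecFlow’s actual implementation):

```python
import re

# Hypothetical step registry: regex pattern -> handler, mimicking how a
# BDD runner binds [When("... name (.*)")] attributes to methods.
steps = [
    (re.compile(r"I create an Interest Rating with name (.*)"),
     lambda name: {"name": name}),
    (re.compile(r"Interest Rating code (.*)"),
     lambda code: {"code": int(code)}),  # converted to int, as in the C# step
]

def run_step(text):
    """Find the first registered step whose regex matches the whole line
    and invoke its handler with the captured groups."""
    for pattern, handler in steps:
        match = pattern.fullmatch(text)
        if match:
            return handler(*match.groups())
    raise LookupError(f"No step definition matches: {text!r}")

print(run_step("I create an Interest Rating with name Test"))  # {'name': 'Test'}
print(run_step("Interest Rating code 4"))  # {'code': 4}
```

The same handler can be registered for several patterns, which is the mechanism behind step reuse across scenarios.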

BDD wraps TDD. A reasonable flow would be to start with the Story, write up the Acceptance Tests, and sketch out some of the steps. As you sketch out the steps you can see what unit tests you need, so you go and develop the code using TDD. Once your code is ready and all the unit tests are passing, you can integrate the layers with the BDD tests; when those are passing, you have fulfilled your Acceptance Test.

Gherkin parsers for SpecFlow are on the way, as are VS.NET language plugins (Cuke4VS – though this currently crashes my VS.NET 2008).

Cuke4Nuke is another Cucumber port that is worth looking at.

The readability of the Features makes it easy to take Acceptance Tests from User Stories so that the Product Owner and Stakeholders can see what the system is doing. The “outside in” nature of creating the code keeps the focus on fulfilling the User Story.


I’ve created a dark scheme for VS.NET as I was finding the white screen was too bright (yes, I did also turn down the brightness on my screen). It’s based on the Aloha scheme.

I’ve saved the .vssettings. You’ll also need the Consolas and Dina fonts (Consolas is probably already installed).

To use these settings go to Tools > Import and Export Settings….

I find this much more legible than the VS.NET defaults, especially for highlighting search results and for editing Xaml. You can port it to VS.NET 2010, too.


There are a couple of downloads of these settings each month. I’d be interested in seeing any tweaks people may have, so I’ve placed a copy on GitHub for folks to fork:


I’ve put together a couple of scripts, poached from the interWebs, that will give those VS.NET keyboard ninjas some more shortcut-fu. Rather than relying on different, incomplete cheatsheets for each of your plugins’ keyboard shortcuts, this macro will find all of your current shortcuts and present them in a single table.

From VS.NET 2008 open the Macros IDE:
Tools > Macros > Macros IDE

Once in the Macros IDE, create a new Module:
Project Explorer > My Macros (context menu) > Add > Add Module ("KeyboardShortcuts")

Copy the following over the file’s contents:

Imports EnvDTE
Imports System.Diagnostics

Public Module KeyboardShortcuts

    Sub ListKeyboardShortcuts()
        Dim i As Integer
        Dim j As Integer
        Dim pane As OutputWindowPane = GetOutputWindowPane("Commands")
        Dim keys As System.Array

        pane.OutputString("<font face=arial>")
        pane.OutputString("<table border=1 cellspacing=0 cellpadding=2 bgcolor=f0f0ff>" + Chr(10))
        pane.OutputString("<tr><th colspan=2 bgcolor=d0d0e0>Keyboard Mappings</th></tr>" + Chr(10))
        pane.OutputString("<tr><th bgcolor=e0e0f0>Action</th>")
        pane.OutputString("<th bgcolor=e0e0f0>Key</th></tr>" + Chr(10))

        For i = 1 To DTE.Commands.Count
            keys = DTE.Commands.Item(i).Bindings
            If keys.Length > 0 Then
                pane.OutputString("<tr>")

                'DTE.Commands.Item(i).Name() is sometimes blank.
                'We will print an m-dash in this case, as printing a blank table cell is visually
                'misleading, as such a cell has no borders, making it appear to be attached to
                'another cell.
                If DTE.Commands.Item(i).Name() <> "" Then
                    pane.OutputString("<td valign=top>" + DTE.Commands.Item(i).Name())
                Else
                    pane.OutputString("<td valign=top>&mdash;")
                End If

                pane.OutputString("</td><td valign=top>")
                For j = 0 To keys.Length - 1
                    If j > 0 Then
                        pane.OutputString("<br>")
                    End If
                    pane.OutputString(keys(j))
                Next
                pane.OutputString("</td></tr>" + Chr(10))
            End If
        Next

        pane.OutputString("</table></font>" + Chr(10))

    End Sub

End Module

You’ll notice the error squiggle under the GetOutputWindowPane() call. We have to add that Utility in (it used to come with the Samples):
Project Explorer > My Macros (context menu) > Add > Add Module ("Utilities")

Imports System
Imports EnvDTE
Imports EnvDTE80
Imports EnvDTE90
Imports System.Diagnostics

Public Module Utilities

    ' This function retrieves the output window pane
    Function GetOutputWindowPane(ByVal Name As String, Optional ByVal show As Boolean = True) As OutputWindowPane
        Dim window As Window
        Dim outputWindow As OutputWindow
        Dim outputWindowPane As OutputWindowPane

        window = DTE.Windows.Item(EnvDTE.Constants.vsWindowKindOutput)
        If show Then window.Visible = True
        outputWindow = window.Object
        Try
            outputWindowPane = outputWindow.OutputWindowPanes.Item(Name)
        Catch e As System.Exception
            outputWindowPane = outputWindow.OutputWindowPanes.Add(Name)
        End Try
        Return outputWindowPane
    End Function

End Module

To execute the macro go back to VS.NET:
Tools > Macros > Macros Explorer > ListKeyboardShortcuts (context menu) > Run

Yes, the output goes to the Output view and you have to cut-and-paste it into a .html but hey, if you want it to do more, update the script. 🙂 HTH


Setting up PartCover for .NET unit test code coverage takes a little work.

Trying to run PartCover.exe on x64 fails with:

“Retrieving the COM class factory for component with CLSID {FB20430E-CDC9-45D7-8453-272268002E08} failed due to the following error: 80040153.”

This is because the COM class requested is 32-bit only while PartCover is running as a 64-bit process. The fix is to mark PartCover.exe as a 32-bit process with CorFlags and then re-sign it, since forcing the flag invalidates the assembly’s strong name.

First, set up the environment for the Microsoft Visual Studio 2008 x86 tools:

C:\> "%VS90COMNTOOLS%\vsvars32"

Then force the 32-bit flag with the .NET Framework CorFlags Conversion Tool:

C:\> CorFlags path\to\PartCover\PartCover.exe /32BIT+ /Force

Finally, re-sign the assembly with the .NET Framework Strong Name Utility:

C:\> sn -k PartCover.Console.snk

-k [<keysize>] <outfile>
Generate a new key pair of the specified size and write it into <outfile>.

C:\> sn -R PartCover.exe PartCover.Console.snk

-R[a] <assembly> <infile>
Re-sign signed or partially signed assembly with the key pair in <infile>. If -Ra is used, hashes are recomputed for all files in the assembly.

Running PartCover

This is an example command line (arguments to PartCover.exe, one per line) to create an XML report of code coverage.

--target .\tools\NUnit\nunit-console-x86.exe
--target-work-dir .\src\Tests\
--target-args Alpha.Modules.Sam.Tests\bin\Alpha.Modules.Sam.Tests.dll
--output .\docs\PartCoverReport.xml
--include [Alpha*]*
--exclude [*Tests]*

Viewing the report

A new viewer for the reports is required (the stylesheets aren’t very good) – ReportGenerator.



A command script with argument placeholders.

--target .\tools\NUnit\nunit-console-x86.exe
--target-work-dir %1
--target-args %2
--output .\docs\PartCoverReport.xml
--include %3
--exclude [*Tests]*


Using PartCover from VS.NET

VS.NET > Tools > External Tools…

Title: PartCover
Command: $(SolutionDir)..\PartCover.cmd
Arguments: $(BinDir) $(BinDir)$(TargetName)$(TargetExt) [*Module*]*
Initial directory: $(SolutionDir)..
Use Output window [check]
Prompt for arguments [check]

Now there is a “PartCover” option in the Tools menu. Select a Test project and select Tools > PartCover. In the displayed command arguments change “Module” to the name of the project, e.g. “Sam”, and run.

The report is written out to: .\docs\PartCover\index.htm

(I would launch IE automatically but haven’t added that to the command script yet.)

x86/x64 shuffle

64-bit versus 32-bit is stinging us all over the place. Now we have some x64 dev machines (nice) while others are using x86 machines. The solution is to set the build platform target of each and every project file to x86.

Also, the build server (TeamCity) is on an x64 box. DLLs that we call, like NUnit and the Oracle data client, had to be updated. Then NBehave had to be updated because its version of NUnit wasn’t exactly the same, so we now have a nightly build of NBehave!

There are even problems with some of our tools like this problem with PartCover.

MSBuildCommunityTasks and Versioning

Auto-versioning our assemblies in TeamCity using MSBuild was more fiddly than I expected but ultimately a very clean implementation.

I’m using TeamCity for continuous integration and MSBuild to run our project’s solution file (.sln) as the build script. I wanted each of our builds to have a version number on the assembly (.exe) so that testers would know what they were dealing with. The format for the version number is the familiar dotted quad of Major.Minor.Build.Revision where the Build would be the build number from TeamCity and the Revision would be our version control system (VCS) revision number (our VCS is Perforce).

I use Scrum for Agile software construction, in an OpenUP project management process, so I’ve decided our Major number is the number of the release to the customer and the Minor is the iteration (Sprint) that produced the build. For example, my first build with this system was 0.8.282.11066, which means: we’ve yet to make a release to the customer (0); the build was from the eighth two-week Sprint (8); TeamCity has completed 282 builds; and Perforce is up to revision 11066.

MSBuildCommunityTasks includes an AssemblyInfo task. These are the steps I used to get this working:
(1) added MSBuildCommunityTasks to our “ExternalTools” folder in Perforce and submitted the MSBuild.Community.Tasks.dll and MSBuild.Community.Tasks.Targets files.
(2) integrated the MSBuildCommunityTasks items from step 1 into the solution’s “tools” folder.
(3) updated the MSBuild.Community.Tasks.Targets file to the correct path to the MSBuildCommunityTasksLib (in our “tools” folder from step 2).
(4) imported the Targets file from step 3 into the project file (.csproj).

<Import Project="$(MSBuildCommunityTasksTargets)" />

(5) created the properties for the parts of the version number, so that it is easy to update for releases and works both on the developer’s machine, where the Build and Revision numbers are set to 0, and on TeamCity, which provides environment variables for the TeamCity build number and the VCS revision number.

    <PropertyGroup>
      <Major>0</Major> <!-- Release -->
      <Minor>8</Minor> <!-- Iteration -->
      <Build>0</Build>
      <Build Condition="'$(BUILD_NUMBER)' != ''">$(BUILD_NUMBER)</Build>
      <Revision>0</Revision>
      <Revision Condition="'$(BUILD_VCS_NUMBER)' != ''">$(BUILD_VCS_NUMBER)</Revision>
      <Version>$(Major).$(Minor).$(Build).$(Revision)</Version>
    </PropertyGroup>
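The same fallback logic, sketched outside MSBuild (assuming TeamCity’s BUILD_NUMBER and BUILD_VCS_NUMBER environment variables, with my Release and Iteration numbers hard-coded):

```python
def build_version(env, major=0, minor=8):
    """Compose Major.Minor.Build.Revision; Build and Revision fall back
    to 0 on a developer machine where TeamCity's variables are unset."""
    build = env.get("BUILD_NUMBER", "0")
    revision = env.get("BUILD_VCS_NUMBER", "0")
    return "{0}.{1}.{2}.{3}".format(major, minor, build, revision)

print(build_version({}))  # developer machine: 0.8.0.0
print(build_version({"BUILD_NUMBER": "282",
                     "BUILD_VCS_NUMBER": "11066"}))  # TeamCity: 0.8.282.11066
```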

(6) deleted the current AssemblyInfo.cs from the source and the VCS.
(7) added the AssemblyInfo task that autogenerates the AssemblyInfo.cs for each build:

  <Target Name="BeforeBuild">
     <AssemblyInfo CodeLanguage="CS" 
                OutputFile="Properties\AssemblyInfo.cs"
                AssemblyDescription="My Product"
                AssemblyCompany="MyCompany Ltd"
                AssemblyCopyright="Copyright © MyCompany Ltd 2009"
                AssemblyVersion="$(Version)" />
  </Target>

Now I need to create a WiX project to build a release version and expose it as an artifact in TeamCity so our testers can easily pick up a build themselves.


Starting Smart Client Software Factory development

Just a quick aide-mémoire of the order of installation of the gubbins for SCSF development:

1) SQL Server 2005
2) VS.NET 2005
3) Enterprise Library (January 2006)
4) Guidance Automation Extensions (February 2007 CTP)
5) Guidance Automation Toolkit (February 2007 CTP)
6) Composite UI Application Block (December 2005) [compile CAB before next step]
7) Smart Client Software Factory (June 2006)
