Installing node.js on Windows

Running node.js on Windows is no longer that difficult. I cheated and installed pre-built binaries rather than building from source via Cygwin.

Here are the steps I took:

  • Download the binaries from http://node-js.prcn.co.cc/. Find the latest “complete” release, which was 0.3.1_2 at the time of writing.
  • Unzip to wherever you like on your disk.
  • Start Cygwin and add node.exe to your path. On my system, I edit /home/smwhit/.bashrc and add the following line at the end: export PATH=.:/cygdrive/c/dev/node/bin:`printenv PATH`. Note how you refer to your Windows C: drive as /cygdrive/[drive letter].
  • Follow the Hello World article at http://howtonode.org/hello-node

Hope that helps someone else get started.


Sqlite3, Fluent-NHibernate and .Net 4

Just a quick one, as I just lost an hour on this problem.

If you are trying to get this combination working, you will likely see the following error message when you try to run your app:

Could not create the driver from NHibernate.Driver.SQLite20Driver, NHibernate, Version=2.1.2.4000, Culture=neutral, PublicKeyToken=aa95f207798dfdb4.

and deeper in the stack trace:

The IDbCommand and IDbConnection implementation in the assembly System.Data.SQLite could not be found. Ensure that the assembly System.Data.SQLite is located in the application directory or in the Global Assembly Cache. If the assembly is in the GAC, use <qualifyAssembly> element in the application configuration file to specify the full name of the assembly.

If, like me, you really had added a reference to System.Data.SQLite.dll, then you need to add something along the following lines to your app.config file. A common culprit on .NET 4 is that the mixed-mode System.Data.SQLite assembly is built against the 2.0 runtime, and it will not load under the v4 CLR without this extra configuration:

<configuration>
  <!-- Let the mixed-mode (2.0-built) System.Data.SQLite assembly load under the .NET 4 runtime -->
  <startup useLegacyV2RuntimeActivationPolicy="true">
    <supportedRuntime version="v4.0" sku=".NETFramework,Version=v4.0" />
  </startup>
</configuration>
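
For context, the Fluent NHibernate side of the setup is fairly small. Below is a minimal sketch, assuming a hypothetical Customer entity and an on-disk app.db database file (none of these names come from the original project):

using FluentNHibernate.Cfg;
using FluentNHibernate.Cfg.Db;
using FluentNHibernate.Mapping;
using NHibernate;

// Illustrative entity and mapping - the names are placeholders.
public class Customer
{
    public virtual int Id { get; set; }
    public virtual string Name { get; set; }
}

public class CustomerMap : ClassMap<Customer>
{
    public CustomerMap()
    {
        Id(x => x.Id);
        Map(x => x.Name);
    }
}

public static class SessionFactoryBuilder
{
    public static ISessionFactory Build()
    {
        return Fluently.Configure()
            // SQLiteConfiguration uses NHibernate.Driver.SQLite20Driver,
            // the driver named in the error message above.
            .Database(SQLiteConfiguration.Standard.UsingFile("app.db"))
            .Mappings(m => m.FluentMappings.AddFromAssemblyOf<CustomerMap>())
            .BuildSessionFactory();
    }
}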

SpecFlow Execution Reports with XUnit

One of the nice things about open source software is that sometimes you don’t need to write code you think you need to.

I have found that it is possible to generate SpecFlow execution reports with assemblies built against xunit even though SpecFlow doesn’t explicitly support this.

The surprise is that xunit.console will generate NUnit-compatible XML results.

So all we need is:

  • xunit.console Bowling.Specflow.dll /nunit TestResult.xml
  • specflow nunitexecutionreport ..\..\Bowling.Specflow.csproj

Hope that helps someone.


Saving Disconnected Entities with Linq2SQL

I struggled yesterday with saving modified entities back to the database after they were disconnected from their original DataContext. After much Googling, I found the way forward is to ensure you have a version column in your database with a timestamp datatype, map it to a Binary property in your mapping metadata, and then call an overload of Attach.

A complete working example is below, as simple as I can make it. Hope that helps somebody.

using System.Data.Linq;
using System.Data.Linq.Mapping;
using System.Linq;
using NUnit.Framework;

namespace CustomerSaver
{
    [TestFixture]
    public class Class1
    {
        [Test]
        public void TrySaveCustomer()
        {
            const string connectionString = @"server=.\SqlExpress; database=test;trusted_connection=true";
            Customer customer;

            using(DataContext readContext = new DataContext(connectionString))
            {
                customer = readContext.GetTable<Customer>().First();
            }

            customer.Name = customer.Name + ".";
            using (DataContext saveContext = new DataContext(connectionString))
            {
                // Attach as modified; the Version member lets Linq to SQL generate the update
                saveContext.GetTable<Customer>().Attach(customer, true);
                saveContext.SubmitChanges();
            }
        }
    }

    [Table(Name="Customers")]
    public class Customer
    {
        [Column(IsPrimaryKey = true)]
        public int Id { get; set; }

        [Column]
        public string Name { get; set; }

        [Column(IsVersion = true, IsDbGenerated = true)]
        public Binary Version { get; set; } //timestamp column in sql server
    }
}
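
As a side note, the Version column is what LINQ to SQL uses for optimistic concurrency, so SubmitChanges will throw if somebody else updated the row between your read and your save. A minimal way to cope with that, assuming last-write-wins is acceptable, is sketched below; it reuses connectionString and customer from the example above, and ChangeConflictException and RefreshMode are standard System.Data.Linq types:

using (DataContext saveContext = new DataContext(connectionString))
{
    saveContext.GetTable<Customer>().Attach(customer, true);
    try
    {
        saveContext.SubmitChanges();
    }
    catch (ChangeConflictException)
    {
        // The row changed underneath us: keep our values, refresh the
        // Version from the database, and resubmit (last-write-wins).
        saveContext.ChangeConflicts.ResolveAll(RefreshMode.KeepChanges);
        saveContext.SubmitChanges();
    }
}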

Running SpecFlow Reports from within Visual Studio

So, I’ve been loving SpecFlow for the past few months. There is a sweet command-line tool that will generate a nice HTML report to show which specs are passing and so on. It’s a bit cumbersome to use (I’ve blogged about its usage before), but I have managed to configure it to run inside Visual Studio, and thought it might be useful for others.

This will run the NUnit tests, run the SpecFlow report against the NUnit output, and finally open the generated report in the default browser.

You can configure External Tools from the Tools menu in Visual Studio.

Select the following options:

  • Command – C:\Source\Scripts\run_specflow_report.bat
  • Arguments – $(TargetName)$(TargetExt) $(ProjectDir)$(ProjectFileName) $(BinDir)TestResult.xml
  • Initial Directory – $(BinDir)
  • Check “Use Output Window”

You also need to create a batch file with the name specified in the Command field. Something along the lines of:

@echo off
rem %1 = test assembly, %2 = project file, %3 = NUnit result XML (matching the External Tools arguments above)
nunit-console %1
if NOT %errorlevel% == 0 (
	echo "Error running tests - %errorlevel%"
	GOTO :exit
)
specflow.exe nunitexecutionreport %2 /xmlTestResult:%3
if NOT %errorlevel% == 0 (
	echo "Error generating report - %errorlevel%"
	GOTO :exit
)
rem Open the generated report in the default browser
if %errorlevel% == 0 start TestResult.html
:exit

The only downside is that you have to select the Specs project when you run the external tool. If you have a workaround, I’d love to see it. Hope that helps someone.


Access the internet behind a proxy programmatically in .net

If you are behind an HTTP proxy and want your application to connect to a REST or web service, it is likely you will not be able to, and you will get an HTTP status code of 407 (Proxy Authentication Required) back.

However, you can pop something along the following lines into app.config/web.config, and your application will use the same proxy settings as Internet Explorer:

<configuration>
  <system.net>
    <!-- Pick up the system (Internet Explorer) proxy and pass the current Windows credentials to it -->
    <defaultProxy enabled="true" useDefaultCredentials="true">
      <proxy usesystemdefault="true" />
    </defaultProxy>
  </system.net>
</configuration>
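
If you would rather set the proxy up programmatically than through config, the sketch below is roughly equivalent; WebRequest.GetSystemWebProxy, WebRequest.DefaultWebProxy and CredentialCache.DefaultCredentials are standard System.Net members, and the URL is just a placeholder:

using System.Net;

class ProxyExample
{
    static void Main()
    {
        // Use the system (Internet Explorer) proxy settings for all requests,
        // and send the current Windows credentials so the proxy stops
        // answering with 407.
        IWebProxy proxy = WebRequest.GetSystemWebProxy();
        proxy.Credentials = CredentialCache.DefaultCredentials;
        WebRequest.DefaultWebProxy = proxy;

        // Placeholder URL - any HTTP call made after this point goes via the proxy.
        WebRequest request = WebRequest.Create("http://example.com/");
        using (WebResponse response = request.GetResponse())
        {
            System.Console.WriteLine(((HttpWebResponse)response).StatusCode);
        }
    }
}
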
Hope that helps someone else.


Updated – Archive version of spreadsheet without formula

Earlier, I mentioned how I was creating an archive copy of a workbook.

I found a more efficient way of doing things that works a lot faster and is a whole lot less code.

Sub SaveFileWithoutFormulas()

    ' Save an archive copy of the workbook first, then replace every
    ' formula in the archived copy with its current value.
    If SaveWorkbook Then
        Dim wsCounter As Long
        Dim ws As Worksheet
        
        For wsCounter = 1 To ThisWorkbook.Worksheets.Count
            Set ws = Sheets(wsCounter)
            ws.Select
            
            ' Clear any autofilter so that every row gets copied
            If ws.AutoFilterMode Then
                Application.ActiveSheet.ShowAllData
            End If
            
            ' Copy the whole sheet over itself, keeping values and number
            ' formats but discarding the formulas
            Cells.Select
            Selection.Copy
            Selection.PasteSpecial Paste:=xlPasteValuesAndNumberFormats, Operation:= _
                xlNone, SkipBlanks:=False, Transpose:=False
        Next wsCounter
    End If
End Sub

Function SaveWorkbook() As Boolean
    On Error GoTo ErrHandler
    
    ' Build a date-stamped file name, e.g. MyBook_01January2011.xls
    Dim workBookName As String
    workBookName = Replace(ThisWorkbook.Name, ".xls", "_")
    workBookName = workBookName & Format(Date, "ddmmmmyyyy") & ".xls"
    
    ActiveWorkbook.SaveAs Filename:= _
        workBookName _
        , FileFormat:=xlNormal, Password:="", WriteResPassword:="", _
        ReadOnlyRecommended:=False, CreateBackup:=False
    SaveWorkbook = True
    Exit Function
    
ErrHandler:
    SaveWorkbook = False
End Function

Open Source Contributions

Well, I’m feeling slightly pleased with myself. Last time, I wrote that I had found it difficult to discover how outputting reports in SpecFlow worked. So, following Ayende’s motto of “I completely agree, can you send me a patch”, I did just that.

I have had my pull request accepted on GitHub, and changes providing some extra documentation are now in the master SpecFlow branch.


BDD with SpecFlow – Reporting the Results

It took me a while to figure out how to generate reports in SpecFlow. After all, now that we have all these lovely specifications written, and the code written to satisfy our requirements, wouldn’t it be nice to generate a report for project managers/customers/whoever so they can see progress on certain areas of functionality?

There is a command-line tool, specflow.exe, that can generate two types of report. The documentation could do with some work, as it is not clear what some of the parameters refer to. I had to look at the source code to find out what some parameters meant and what the defaults were. (I am planning on submitting a patch to the team with improved help docs!)

It is useful to put the SpecFlow installation directory on your path so you can run specflow.exe from anywhere. By default, the installation directory is C:\Program Files\TechTalk\SpecFlow.

Test Execution Report

This is a high-level summary showing how many scenarios have passed or failed, as well as timings of the steps. This report uses the NUnit results XML file as its input, which wasn’t obvious.

The syntax for this command is:

specflow nunitexecutionreport VisualStudioProjectFile /xmlTestResult:PathToNunitTestResultXMLFile

In my case, I execute:

specflow nunitexecutionreport Specs.csproj /xmlTestResult:bin\debug\TestResult.xml

Step Definition Report

This seems to be a way of seeing what steps you have written and how many times they are called. We were able to identify a number of steps that had become obsolete during the refactoring of some specs.

The syntax for this command is:

specflow stepdefinitionreport VisualStudioProjectFile

In my case, I execute:

specflow StepDefinitionReport Specs.csproj


Thoughts on TDD, Automated Testing and BDD with Specflow

This week, I have had a bit of an epiphany with regard to how we should approach automated testing on our projects.

Misgivings about TDD style testing

  • The value of TDD-style testing comes as an aid to code design rather than testing. If you are exploring designs first and then writing tests afterwards, where is the net benefit? They do give you confidence that the changes you make are not breaking existing functionality, such as inadvertently inverting some boolean logic that might otherwise have slipped through.
  • The value of unit tests utilising mocks is dubious. They are often painful to set up, inevitably difficult to read, and a pain to maintain. Unless you refactor your code in a very strict way, your tests will need to be rewritten as the API they call completely changes.
  • Is there really much inherent benefit in isolating everything but one object and mocking interactions with other objects? Again, in my opinion, not unless you are driving the design of your code through mocking.
  • How can you know that you have written the correct tests? It is “easy” to get 100% code coverage, but not so easy to see that you have covered your code with relevant and correct assertions.
  • I am simply too lazy, and not smart/disciplined enough, to write unit tests that strictly satisfy the definition of a unit.

What do I want from automated testing?

  • A safety net that will protect our code from changes or additions unintentionally affecting algorithms
  • Mainstream tooling and language support (e.g. NUnit, mocking frameworks, integration with Visual Studio and Continuous Integration software)
  • A focus on the desired behaviour of the software rather than infrastructure and internals. This forces developers and domain experts to think immediately about what the specifications (specs) of the software should be.
  • Tests/specs to be as frictionless as possible to write. Tests that are difficult to write, in my experience, don’t get written. As much as possible, avoid complex data setup and external services, to encourage people to write tests.
  • Ability to generate test fixtures using a Domain Specific Language any domain expert/developer can understand
  • Use as many of our concrete implementations of objects as possible, but retain the option to mock external interfaces such as the database, file system and web services. Much behaviour or functionality in applications is expressed via the interaction of multiple moving parts. Our tests should reflect this.
  • Ability to generate reports on the number of specs implemented and passing/failing in a readable and portable manner

The result is a series of BDD-style specs using a .NET project called SpecFlow, written in plain English using the Gherkin DSL, which comes from the Ruby project Cucumber; a minimal example of a step binding is sketched below.
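
To give a flavour of what these specs look like in practice, below is a minimal, hypothetical step-binding class for a bowling-style scenario, using the standard TechTalk.SpecFlow attributes; the Game class and the step wording are illustrative only, not taken from our codebase.

using NUnit.Framework;
using TechTalk.SpecFlow;

// Illustrative domain class - just enough to make the binding compile.
public class Game
{
    public int Score { get; private set; }
    public void Roll(int pins) { Score += pins; }
}

// Binds plain-English steps such as:
//   Given a new bowling game
//   When I roll 7 pins
//   Then the score should be 7
[Binding]
public class BowlingSteps
{
    private Game _game;

    [Given(@"a new bowling game")]
    public void GivenANewBowlingGame()
    {
        _game = new Game();
    }

    [When(@"I roll (\d+) pins")]
    public void WhenIRollPins(int pins)
    {
        _game.Roll(pins);
    }

    [Then(@"the score should be (\d+)")]
    public void ThenTheScoreShouldBe(int expectedScore)
    {
        Assert.AreEqual(expectedScore, _game.Score);
    }
}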

I’m not going to describe the rationale behind BDD, as others have done so before, better than I can, here and here.

SpecFlow is currently at version 1.2.0 and has versions for Visual Studio 2008 and 2010. The reason for the different versions is that there are specific item templates within Visual Studio to aid the creation of SpecFlow features and steps.

Steve Sanderson has an excellent post covering SpecFlow with ASP.NET MVC, which was my inspiration for exploring this framework in the first place. We have a Windows Forms client, so our specs cover the layers immediately below the UI (presenter and downwards).

References
SpecFlow Home Page
Cucumber Project
Steve Sanderson Article
