PlasticSCM – Just Don’t

If you’re considering this source control tool.  Stop.  If you’re considering this for your development team, have mercy on them and stop.

I haven’t been confused by, annoyed by, dumbfounded by or angry at a source control tool this much since I stopped using Visual Source Safe 10 years ago.  This is one of the absolute worst development tools I’ve ever had forced on me.

The hilarious thing is – it isn’t even free, people are paying real money for this!  Believe me, if it was my choice of tooling they’d have to pay me to use it.  If you are charging for source control these days you have to be pretty damn special, and PlasticSCM really is not special at anything at all.

I’m not even going to go into the details of why you shouldn’t use it as I simply cannot be bothered to have this tool waste any more of my time, but I feel the need to warn others.  This tool is incoherent.  Just use Git.  There is no benefit in this tool over Git, plus Git is free.


Git, Git, Git, Git, Git, Git.


TeamCity – Sharing Build Properties

Sharing parameters and properties can be a bit confusing for new users of TeamCity.  Let’s use a simple example to illustrate how you can do this.

For the sake of this explanation, let’s say you have two build configurations that are essentially building the same thing but for different distribution uses; the source code is the same, which results in the same revision or commit of the source repository.  The only material difference might be some certificates, binary labels or identifiers.

Another scenario might be integrating with an external deployment system like Octopus Deploy, which you want to kick off in its own build configuration.  In either case you want the two build configurations to have the same build number.

Let’s say you have an application called MyApp with Debug, Release and Deploy build configurations, and you want the Deploy configuration to use the Release build number.

So, to do this:

  1. Navigate to the Release build configuration and note down the buildTypeId from the browser URL (for this example let’s say this is webrelease).
  2. Navigate to the Deploy build configuration
  3. Click on Edit Configuration Settings, then Dependencies, then Add Snapshot Dependency
  4. In the dialog that pops up select the check-box that corresponds to the Release build configuration and set any other relevant settings
  5. You have now “linked” the two build configurations, which means you can use the %dep.x% parameter notation
  6. To set the build number, navigate to the General Settings of the same Deploy build configuration
  7. In the Build Number Format text box enter: %dep.webrelease.build.number%

Voila!  The next execution of the Deploy build configuration will have the same build number as the Release configuration.  Go nuts with your params!



Xamarin iOS TeamCity Build

So this turned into a complete nightmare for a while.

TeamCity Mac Agent Push Installation

I started by trying to get TeamCity to do a push install of a new agent on my MacBook Pro and very quickly things got very, very ugly.  First off it couldn’t find a JDK (which was installed), so I tried updating my ~/.bash_profile with the JAVA_HOME variable, but nothing seemed to work.  I fixed that by simply downloading the latest JDK release package and installing it.

Once the JDK errors and my ~/.bash_profile were fixed I then got the error (this is it in its entirety) “su: Sorry” … at that point I threw in the towel.  I’ve got better things to do than battle with this crap.  I abandoned the plan to push install the agent as I simply couldn’t figure out what was wrong with no information; the error text above was all that appeared in the agent’s log file as well, so it’s anyone’s guess as to what the problem really was.

NOTE: Make sure you can browse to the TeamCity site on your network using the agent machine

So I dialled into my TeamCity server, grabbed the zip installer and did it all manually.  This worked first time (after amending the buildAgent properties file), so whatever the issues were with the push installation I don’t know and frankly I don’t care; JetBrains have some tidying up to do on this.

Build Steps

Since the build for the iOS application is now happening using Xamarin Studio installed on the Mac the build process is much simpler.  At the moment my unit tests are all run during debug builds only and these occur on a PC running another build agent.  I will revisit this to start executing test runs on the Mac agent to make for a more holistic testing process.

Updating Plist Files

There are a number of ways you can update the plist data on OSX.  Some people suggest using the terminal defaults command but I found that didn’t work when presented with an iOS style Info.plist.  Chiefly it seemed to be confused by the internal layout having arrays and a dict element.

By far the simplest way I found was making use of the standard OSX tool PlistBuddy, which can deal with our iOS plist data.  So I added a new build step (executed before the compile step) and configured it to update the app’s internal version numbers and bundle identifiers based on the build configuration.

I normally use a 4 element version number for .NET (major).(minor).(build).(revision) which gives full traceability back to the source control for an individual release.  But if you use this in all cases you’ll have problems as iOS uses the semantic version number format (major).(minor).(build) for the “CFBundleShortVersionString” version number.

But you can also use arbitrary values in the “CFBundleVersion”.  Lots of people seem to ignore the AssemblyInformationalVersionAttribute in .NET but this is a great feature as you can use arbitrary strings as well as the more formal version format when using it in .NET assemblies.
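To make the version split concrete, here’s a small sketch (illustrative Python, not from the original build) showing how a 4-part .NET version plus a VCS revision maps onto the two iOS values described above:

```python
def ios_versions(dotnet_version: str, vcs_revision: str) -> tuple[str, str]:
    """Split a 4-part .NET version into the two iOS version strings.

    CFBundleShortVersionString must be (major).(minor).(build), while
    CFBundleVersion can carry the extra traceability information.
    """
    major, minor, build, _revision = dotnet_version.split(".")
    short_version = f"{major}.{minor}.{build}"           # CFBundleShortVersionString
    bundle_version = f"{dotnet_version}.{vcs_revision}"  # CFBundleVersion
    return short_version, bundle_version
```

So a .NET version of 1.2.3.4 at VCS revision 56 would give “1.2.3” for the short version string and “1.2.3.4.56” for the bundle version.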

$ /usr/libexec/PlistBuddy -c "Set :CFBundleIdentifier (yourbundleid)" Info.plist
$ /usr/libexec/PlistBuddy -c "Set :CFBundleShortVersionString %build.number%" Info.plist
$ /usr/libexec/PlistBuddy -c "Set :CFBundleVersion %build.number%.%build.vcs.number%" Info.plist
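If you’d rather not shell out to PlistBuddy, the same edit can be scripted; this is a hedged sketch using Python’s stdlib plistlib (the function name is mine, the keys match the commands above):

```python
import plistlib

def update_info_plist(path, bundle_id, short_version, bundle_version):
    """Rewrite the identifier/version keys in an iOS Info.plist in place."""
    with open(path, "rb") as f:
        info = plistlib.load(f)
    info["CFBundleIdentifier"] = bundle_id
    info["CFBundleShortVersionString"] = short_version
    info["CFBundleVersion"] = bundle_version
    with open(path, "wb") as f:
        plistlib.dump(info, f)
```

Unlike defaults, plistlib (like PlistBuddy) copes fine with nested arrays and dicts in an iOS-style plist.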

Compile – Obsolete

The build step for compiling the iOS configuration is executed using a CommandLine build runner with a simple command like this:

/Applications/Xamarin\ Studio.app/Contents/MacOS/mdtool build "--configuration:Debug|iPhone" /path/to/solution.sln

Archiving Builds – Obsolete

Once the build is complete you’ll want to automatically create your archive.  When you come to distribute your application to the App Store or to TestFlight via Xcode you’ll want to use an xcarchive.  You can use IPA files from the Build step with the Application Loader, but as far as I can tell there is no way to upload dSYM files for any symbolication processes, which is a bit crazy (unless I’ve missed something).

Create another build step and use:

/Applications/Xamarin\ Studio.app/Contents/MacOS/mdtool -v archive "--configuration:Debug|iPhone" /path/to/solution.sln

This will place an xcarchive file into ~/Library/Developer/Xcode/Archives/.  You can then validate this archive via Xcode -> Window -> Organizer on the Archives panel.


So, it seems the recent versions of Xamarin Studio and the MDTOOL method of compiling and archiving have become problematic.  I started to see builds failing with some odd errors relating to my Droid project inside the build logs of the iOS builds.  Hugely confusing.  I never got to the bottom of why this was occurring and decided to review the whole process.  Both the compile and archive steps are now combined into one step and one command.  The command I’m now using is:

/Library/Frameworks/Mono.framework/Commands/xbuild /p:Configuration=AppStore /p:Platform=iPhone /p:BuildIpa=true /p:ArchiveOnBuild=true /target:Build /path/to/solution.sln

Obviously you can change any of the configuration or platform parameters to suit your requirements / project setup.

As I work through the next steps of this process I’ll keep this blog post updated – you can read the Android version here.


Source Control – Learning Git

Over the years I’ve used many source control systems.  Most of them seemed pretty simple to use after a short period of time.  However they all have their idiosyncrasies that will occasionally catch you out and some of them have been next to useless at their intended purpose.

The worst of these systems I have had personal experience with was Visual SourceSafe from Microsoft; this shouldn’t be used by anyone, ever.  I’ve seen it do the one thing a source control system should never do – lose your work.  I must admit that all my experiences with Microsoft solutions for this particular task have been poor.  I’ve only had TFS inflicted on me once and that experience was extremely lumpy; I really never want to have it again.  Certainly after reading that at the core of TFS is a re-written Visual SourceSafe, shudder.

There are many source control systems to choose from, which can make the decision bewildering.  They range in price from free to eye-wateringly expensive “corporate solutions”.  Source control is source control; it either does it or it doesn’t, you can keep your bells and whistles thank you!  I haven’t had the opportunity to try any of those since I’ve never worked for a massive organisation that was prepared to spend that kind of cash – yes, they really can be THAT expensive, think six-figure sums of cold hard cash, just on source control.  Your product has to be worth millions before you’d even consider them.

Subversion Source Control

For most of that time, including my home projects and business source control tasks, I have been using Subversion, particularly VisualSVN Server and TortoiseSVN.  Since these projects are a one-developer affair the extra functionality of a distributed architecture for collaboration was never a consideration, and since I was also already very familiar with the system this was a no-brainer decision.  Install, set up a repository, get on with code.

To be honest SVN still isn’t a terrible choice (unless your name is Linus then you are stupid and ugly by default).  It does the job, it doesn’t get in my way (that often) and since everything is local to my machine it also performs well.  As soon as you have to start going over the network the performance can degrade dramatically though.

Enter Git Source Control

So last year (yes, I know I’m very late to the party) I had my first commercial requirement to use Git, and I also started using GitHub to host some of my own projects related to my Prism articles (part1, part2) and other bits and bobs.  I think one of the reasons I had kept Git at arm’s length was that I hadn’t been given a compelling reason NOT to carry on using Subversion.  I also hadn’t been given a commercial reason to NEED to use Git; I was already busy enough writing code without changing core elements of my workflow.

Another issue was that during my first commercial use of Git, when I hit an issue the guys who owned the repository also didn’t know how to solve the issues which wasn’t very reassuring.  That just put a flag up for me hammering home that “Git was inherently complicated” – avoid.  I had also struggled to get some of my own stuff up onto GitHub using their own Git client application (which I now don’t use as I don’t think it’s very good to be honest).

There was no denying the rise and rise of GitHub as a platform and the fact that all my fellow developers seemed to be flocking to Git and loving it.  The final straw that led to this blog post was that I really wanted to help out on some open source projects – chiefly MvvmCross.

Now I had to learn Git and boy am I glad I did.

The initial confusion is absolutely related to the way you think about source control: Git takes familiar verbs like Add, Commit and Branch (among others) but thinks very differently about them, and does very different things in response to them on the command line.  BUT once you get it, you’ll realise the shortcomings of all the source control systems you’ve used before.

Git solves problems you didn’t even know you had.

The biggest difference that WILL confuse you is going from a traditional centralised repository model to Git’s distributed model.  Don’t assume you will just get this fact; I thought I would, I didn’t, it hurt.  The best graphic I’ve seen that encapsulates this is …


Git’s distributed model

The basic gist of this is that there is no one master repository that you are used to interacting with.  There are many but most importantly you have both a local “working copy” and a “local master repository” or “staging area” which you commit to.  You can link to an arbitrary number of “remotes” (other repositories) and push and pull from any of them as if they were all master repositories (in centralised parlance).  It couldn’t be more different from something like Subversion.

In the graphic above the origin can be thought of as a traditional central repository but you can have lots of these if you want.  More importantly you can also pull changes from anyone else and they can pull (update in SVN terms) from you.  All very cool, and very powerful.

Switching to Git Source Control

To get started with Git you first need to install Git itself (none of the gui downloads include Git).  A good GUI client (none of them are perfect) is SourceTree from Atlassian.  They also provide great resources here.  There is also a completely free Pro Git book.

I’m not going to go into any more details about that here as this is what the videos and links are for.  All these resources and the main video above will give you some context on Git.  Once you’ve watched that video this is the next one you should watch to get to grips with the nuts and bolts of Git.


Package Dependencies & Deploying .NET

I recently had a requirement to completely package up a .NET application so that it was extremely portable beyond its initial installation.  By bundling all the dependencies into the main executable package you can move the application around with minimal fuss once installed.  The goal was to keep the application as independent as possible.

The application is actually fairly simple with only a handful of required library files to package.  Once all the dependencies had been packaged the entire application executable is still under 600k.  This process detailed below also enables the package to contain non .NET assemblies, like C++ components and other files.  What’s really good is by default the dependencies are never extracted onto the file system which is ideal.

Initially this might sound like a complicated process. However … all you need is one NuGet package Fody/Costura (Source).  That’s it!  Just install the package and you’re done!

The NuGet Package Solution

Make sure that you have your start-up project selected in the package manager console and execute this command:

Install-Package Costura.Fody

You don’t even have to configure anything.  Just build …
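Under the hood the package registers itself through a FodyWeavers.xml file in the project root; if you ever need to check it, a minimal one just lists the Costura weaver:

```xml
<Weavers>
  <Costura />
</Weavers>
```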

The only thing that might need doing, in order to make sure your package contains exactly what’s needed, is adding a new set of targets to your *.csproj file.

When you install the package it will automatically add a new import:

<Import Project="packages\Fody.1.26.1\build\Fody.targets" Condition="Exists('packages\Fody.1.26.1\build\Fody.targets')" />

Then to make sure you get the package you really want you can add:

<Target AfterTargets="AfterBuild;NonWinFodyTarget" Name="CleanReferenceCopyLocalPaths">
  <Delete Files="@(ReferenceCopyLocalPaths->'$(OutDir)%(DestinationSubDirectory)%(Filename)%(Extension)')" />
</Target>

Done.  Very nice and a must-know trick.  The only thing I want to tackle now is bundling in the base configuration file.


TeamCity Backup Location Outside TeamCity Data Directory

By default TeamCity doesn’t allow you to place backup files outside the default data directory (internally known as the TeamCityData).  If you want to have these files dumped into a location that is already part of your automated backup processes you need to make some configuration changes to TeamCity.

  1. Locate your TeamCityData directory and navigate to the config directory
  2. Create a new file in here called internal.properties
  3. Open the file and paste in teamcity.maintenance.backup.allowAnyFilePath=true
  4. Save the file
  5. In the TeamCity UI go to Administration -> Diagnostics -> Internal Properties
  6. Confirm the setting appears in the top section called “File properties (from G:\TeamCityData\config\internal.properties)”
  7. Navigate to the Backup page and set your location as desired

Xamarin Android Automated CI Publishing Using MSBuild & JetBrains TeamCity

I love automating processes and making things consistently repeatable.  Continuous integration is your friend; yes, it can take time to set up, but if you take that effort hit upfront it will pay off in an immeasurable way throughout your development process and the entire lifetime of your development.

For my continuous integration I use JetBrains TeamCity.  I love this CI system; granted I’ve not used them all, but compared to TFS and Jenkins it is an oasis of calm, and without doubt has the cleanest UX design by a very long margin.  Anyway … I digress.  If you want to automate the process of producing your ready-to-distribute Xamarin .apk Android application packages, this is the basic process you need to follow:

Creating Your Keystore

The first step is to create your own personal keystore that will contain the information used to digitally sign your Android package files.  You can do this with the following command:

"C:\Program Files (x86)\Java\jre1.8.0_45\bin\keytool.exe" -genkey -v -keystore myandroid.keystore -alias myaliasdroidpub -keyalg RSA -keysize 2048 -validity 30000

Before you run this command, make a note of a few parameters first.  I did all of this in a batch file so I can refer to it later; this does include some sensitive data, so beware what you do with this file.  The 30000 at the end of the command denotes the validity period of the certificate in days, and Google requires this to extend past 2033.  You’ll need the following info:

REM — Password – <yourpassword>
REM — name —– <yourname>
REM — OU ——- <organisationunit> eg: JamSoft
REM — Orgname — <organisationame> eg: JamSoft
REM — Local —- <locality> eg: Frome
REM — State —- <state> eg: Somerset
REM — Country — <2lettercountrycode> eg: UK

Visual Studio Android Project

Now that we have our Java keystore ready and prepped for use we can look at the Visual Studio project.  In order to make this automated in the build system we need to configure the project to use our keystore credentials.  This is spreading sensitive information around, but since our keystore is probably going to be checked into our repository it’s most likely going to spend its life being treated in a sensitive fashion anyway, so no real concerns.  I personally make lots of backups of things in large .RAR files and store these in various places, but these are also password protected and treated as confidential.

In Visual Studio edit the Android application .csproj file and add another PropertyGroup element as per the code below:

  <PropertyGroup Condition="'$(Configuration)' == 'Release'">
    <AndroidKeyStore>True</AndroidKeyStore>
    <AndroidSigningKeyStore>myandroid.keystore</AndroidSigningKeyStore>
    <AndroidSigningStorePass>yourpassword</AndroidSigningStorePass>
    <AndroidSigningKeyAlias>myaliasdroidpub</AndroidSigningKeyAlias>
    <AndroidSigningKeyPass>yourpassword</AndroidSigningKeyPass>
  </PropertyGroup>

Now our .csproj file knows how to use our keystore unattended. We can tie into the Xamarin build process from within our automated builds and produce the base Android package.

You can test that this is working using the following command:

msbuild.exe HelloWorld.csproj /p:Configuration=Release /t:PackageForAndroid

This will execute MSBuild against your Android project file and you’ll now find your .apk file in your bin\Release directory.

Signing The Package

Now that we have our packaged application we can apply the signing processes.  To sign the package created in the previous step we need to execute the following command:

"C:\Program Files (x86)\Java\jdk1.7.0_71\bin\jarsigner.exe" -verbose -sigalg SHA1withRSA -digestalg SHA1 -keystore myandroid.keystore -storepass yourpassword -keypass yourpassword -signedjar \bin\Release\packagename-signed.apk \bin\Release\packagename.apk myaliasdroidpub

This package is now digitally signed using your certificate from the keystore we made earlier. Personally I still have a couple of questions around why this needs to be done again considering that the details for this keystore are in the csproj file but I’m still waiting on a response from Xamarin on this.

ZipAligning the Signed Android Package

Now that we have a signed package we can zip align this package and then publish this as an artifact of our TeamCity build process.  This command makes use of the Android SDK zipalign.exe program.  You’ll have to find where this is on your machine as there are many potential locations.  The command you need will look something like this:

"C:\Users\<name>\AppData\Local\Android\android-sdk\build-tools\<version>\zipalign.exe" -f -v 4 packagename-signed.apk packagename-zipaligned.apk

You don’t need to follow the naming convention used here; this is just for illustration purposes.

You can now set this as a published artifact of your TeamCity build configuration.  Yay!

MSBuild Project

To kick off the process in your MSBuild project you can add a TeamCity step that executes an additional target in your build script like this:

  <Target Name="PackageAndroidApk">
    <MSBuild Projects="$(MSBuildProjectDirectory)\..\src\path\MyProject.Droid.csproj" Targets="PackageForAndroid" Properties="Configuration=Release" />
  </Target>

This target will create the base APK package that you can sign and zipalign.

Once you have all these commands working you can add all the required build steps.


And voila!  Automated.  Nice 🙂


TeamCity CI Server Login Fails at Boot Up

If you’re running an instance of TeamCity on your local dev box (and who isn’t, eh! haha, oh for more machines) it’s highly likely that over the last two major versions (8 and now 9) you are often, if not always, presented with something like this after boot up:

Cannot open database "TeamCity" requested by the login. The login failed. ClientConnectionId:f17b1159-b183-4241-bc98-c119d1767b49
SQL exception: Cannot open database "TeamCity" requested by the login. The login failed. ClientConnectionId:f17b1159-b183-4241-bc98-c119d1767b49

The issue is that TeamCity tries to connect to its database before SQL Server has finished initialising, so obviously the login is going to fail.  I see this happening on every boot; a simple flick of the switch to restart the service and we’re off.

It’s very annoying and I’ve raised a bug report twice with JetBrains, once for version 8 and once for version 9.0.4 (the very latest at the time of writing).  It might seem like a trivial problem but there are a few things that directly contribute to this issue being out of their hands.  The driver (sqljdbc_4.0.2206.100_enu, sqljdbc4.jar) is authored by Microsoft and seems to contain a bug in that it doesn’t always report the correct state, meaning JetBrains cannot rely on its response in order to Do The Right Thing™.

SO … time to break out the batch file …

The (Hopefully Temporary) Solution

Make a batch file with this in it:

net stop TeamCity && net start TeamCity

And then set up a scheduled task to run it 5 minutes after boot up, run as admin; sorted.


Restoring YouTrack Backups

I love JetBrains software, I really do.  How I’d get by without ReSharper is anyone’s guess, but they are not issue free by any stretch.  There also seem to be some very odd issues with their documentation a lot of the time as well.  A real shame considering how nice their products are and how easy they are to get going with; the process of migrating to a new version can be a bit hit and miss – if you follow their docs.

A case in point is YouTrack.  I’ve just had to reinstall my server that runs my local instance of YouTrack.  Now an issue tracker for a developer is insanely important.  It will more often than not contain lots of hard earned intelligence not to mention priorities for work and ways for lots of disparate team members to communicate.

I was running YouTrack 5.2.5 and had it configured to do automatic backups.  The JetBrains instructions on how to restore a backup look remarkably straightforward (it is, by the way, just not how they describe it!):


Important note
For successful restoring, version of the backed up database should be the same or older than the version of YouTrack, to which you restore this database.

To restore a database from a backup, proceed as follows:

  1. Download a backup tar.gz file via web UI from the Administration > Database Backup page or locate a needed backup file directly in the specified backup folder on the YouTrack server machine.
  2. Stop YouTrack server instance.
  3. Clean all content of the database location directory. By default, the database is located in the teamsysdata folder in the home directory of a user account, which starts YouTrack. You can change default database location, for more details please refer to the Changing Database Location page.
  4. Extract the downloaded backup file to your database location.
  5. Start YouTrack.

The instructions I’m about to write below work for 5.x and 6.x (I’ve tested these on both versions and they work).

Firstly, the backups I have generated from YouTrack are not tarballs, they are .zip files.  This is configurable in YouTrack, but I cannot find anything on the JetBrains site that details the process for zips, so I can only assume that either configuration will result in a file that contains the same data.  Anyway, I digress.

So mistake number one, no mention of zip files.

The Solution

Step three says – Clean ALL contents of the database directory.  DO NOT DO THIS!!  If you delete all the files in there as suggested and then dump the contents of your backup ZIP into the directory, it doesn’t work.  At least it didn’t for me.  Also, the teamsysdata directory mentioned in the article refers specifically to YouTrack 5.x, yet the article was written in 2012 and is also part of the 6.x documentation.  YouTrack 6.x is vastly different to 5.x and does not use the same structure for the database at all.

What I do (and always works for me) is:

  1. Logout of YouTrack
  2. Stop the YouTrack service
  3. Remove the existing 00000000000.xd file from your YouTrack database directory (your file name may be slightly different)
  4. Copy over your backup *.xd files into the database location – mine were called (00000010g00.xd, 00000010o00.xd, 00000011000.xd)
  5. Copy over your blob directory into the existing blob directory
  6. Leave all the other files in place and untouched
  7. Restart the YouTrack service
  8. Login under the root account
  9. You should now see your restored issues
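The middle steps are just file juggling, so here’s a rough Python sketch of that part (the function name is mine, and the paths and file names are illustrative, matching the ones above):

```python
import glob
import os
import shutil

def restore_youtrack_db(backup_dir, db_dir):
    """Copy backed-up *.xd files and the blob directory into the live
    database directory, removing only the existing initial .xd file."""
    # Step 3: remove the old initial database file if present
    stale = os.path.join(db_dir, "00000000000.xd")
    if os.path.exists(stale):
        os.remove(stale)
    # Step 4: copy the backed-up *.xd files into the database location
    copied = []
    for src in sorted(glob.glob(os.path.join(backup_dir, "*.xd"))):
        shutil.copy(src, db_dir)
        copied.append(os.path.basename(src))
    # Step 5: merge the blob directory, leaving everything else untouched
    blob_src = os.path.join(backup_dir, "blob")
    if os.path.isdir(blob_src):
        shutil.copytree(blob_src, os.path.join(db_dir, "blob"), dirs_exist_ok=True)
    return copied
```

Stopping and restarting the YouTrack service (steps 2 and 7) still has to happen around this, of course.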

Personally I don’t understand how a company that makes such great things as ReSharper can leave such glaring issues in their documentation, certainly when it refers to something as important as restoring backups of your hard-earned intelligence.

Anyway, I hope this helps and clears up some confusion surrounding this issue.  JetBrains, if you are listening, please sort this out.


Nuget Strikes Again

Honestly, for a tool that’s supposed to make life easier and more robust I find myself battling with it often.

I’ve been upgrading some of the packages in one of my projects and ran into this pretty little error:

The schema version of 'Microsoft.Bcl' is incompatible with version 2.1.31002.9028 of NuGet.
The schema version of 'Microsoft.Bcl.Build' is incompatible with version 2.1.31002.9028 of NuGet.
The schema version of 'Microsoft.Net.Http' is incompatible with version 2.1.31002.9028 of NuGet.
etc ...

Er …

OK, so initially I thought I’d update the version of NuGet that TeamCity is running.  Updated that, reran the build … same problem.  Odd, I thought … did some Googling around and found out that you have to MANUALLY … yes, you read that right, MANUALLY update NuGet … What the very fuck?  This tool gets worse in my opinion the more I learn about it.

SO, go to the .nuget directory in your solution and run this:

nuget.exe update -self

So just what the hell is happening when VS tells me NuGet Package Manager needs an update? … yet the nuget.exe in my solution was still languishing at version 2.1 … honestly, this is shit!

Now I’ve gotten my Debug build to work, but I keep getting these errors on my release build:

The build restored NuGet packages. Build the project again to include these packages in the build.


This is EXACTLY what NuGet is supposed to bloody do!  That is one of its core functions … run the build again?  Can you tell that I’m seriously pissed off with this tool right now?


CI Process – Why?

Why is having a CI process good?

There are so many reasons why CI is good.  I use it on everything I do.  For my recent web-based project it’s simply a must have.  Forcing yourself to take an application and pipe it through a series of standardised processes is invaluable.  In regards to web applications I go through the full monty.  The process I use is:

  • Dev & Debug in Visual Studio
  • run unit tests locally
  • commit to SVN
  • TeamCity Debug Build (Unit Tests included)
  • TeamCity Debug Build with ReSharper Code Inspections and FxCop
  • TeamCity Release Build with NuGet Artifacts
  • Octopus Deploy to local IIS for Staging

I do this even though it’s only me on the project.  A web application should be sturdy enough to make it through this pipeline and flourish at the other end.  It helps keep things robust, transferable, repeatable, tracked and traceable.  All good things for software!

Obviously the quality and coverage of your unit tests is a key element, but the pipeline described above is also key to ensuring that an engineered process is always taking place; the randomness is removed entirely.  Ideally the build process should be occurring on a separate machine, as the environment it’s being integrated in is the same environment it’s designed in, which isn’t ideal but arguably less troublesome than doing it with a desktop application complete with Windows installer.

The next step to be added to this process is something along the lines of Selenium automated tests to run against the website itself.


MVC Deployment – JavaScript – ReferenceError is not defined …

This had me stumped for a moment.

All the Javascript was there (linkable from bundle links in the page source), all the files were there, everything worked in dev (doesn’t it always?).

Anyway, if you’re seeing this error in your deployed applications there are a couple of steps to take.

Firstly you should download and install the Visual Studio tool JSLint.VS2012.  Then set this up to show warnings and don’t set it up to fail the build.  JSLint is not kind and is very strict about things (your adherence to best practice is your choice, but recommended for sure!).

So, you deploy your app and BANG, all your lovely JS is twatted.  Never fear … pop over to your web app’s Global.asax file and in the Application_Start method include this:

BundleTable.EnableOptimizations = true;

With this setting in place you can launch your app in debug mode with all the optimizations forced on, to test minification and more closely model your testing on the deployed version.

Once you have that working and are seeing the problems more clearly you can start to work through the potential issues using the reports from your new JSLint VS plugin to fix the syntax and other formatting issues.


Build Process – TeamCity, NUnit, dotCover & Octopus Deploy

I’ve blogged about TC a bit in the past but I’ve just setup a whole new build process for my current project.

Added into the mix these days are dotCover and Octopus Deploy.  I have to say that I’m seriously impressed with the simplicity of this flow to release.  There are still a few elements of Octopus Deploy to get my head around (deceptively simple tool!).  Anyway, as ever, getting some of the reports configured took a while …

Get your coverage filters right in TeamCity:


If, for instance, your project is called JamSoft and all your application DLLs are named JamSoft.blah or JamSoft.blah.blah, then the filter string should be “JamSoft” — and of course you’ve suffixed all your test libraries with .Test haven’t you … 🙂
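As a sketch, with those names the TeamCity dotCover coverage filters would contain something along these lines (assembly-name filters; the exact wildcards here are my assumption, so check them against your own assembly names):

```
+:JamSoft.*
-:JamSoft.*.Test
```

Getting the exclusion right matters, otherwise your coverage numbers get flattered by the test assemblies covering themselves.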

Unless you already have a tool you’re using, I found the TeamCity internal instance of dotCover a good solution. I tried to get NCover 1.5.8 working and just gave up in the end; it’s old and getting problematic as far as I’m concerned, and since the TC internal solution is available, use it. Saves some potential headaches.

I also had a few teething problems getting NUnit working this time around. I was using an installed version of NUnit 2.6.2, however on the first build in TeamCity it couldn’t complete the initial compile step as it couldn’t find the DLLs. I ended up switching over to using a NuGet NUnit package and then the compilation steps were fine. A bit odd, since I was running the TC builds on my development machine, so NUnit was definitely available.

As soon as I’ve got a couple of things ironed out with my use of Octopus Deploy I’ll no doubt blog about that as well, however for now I’m a total noob so …


Batch Files Really Are Your Friends … Svn Backups

Just finished a long coding session … time to back it all up.


set SvnAdminPath=C:\Program Files (x86)\VisualSVN Server\bin\svnadmin.exe

"%SvnAdminPath%" dump --deltas P:\Repositories\<repo_name> | "<path>\gzip.exe" > <backup_path>\SourceSvn.dump

echo Copying dump to h:\backup
xcopy /Y <backup_path>\SourceSvn.dump H:\Backups

echo Copying dump to o:\backup
xcopy /Y <backup_path>\SourceSvn.dump O:\Backup




Subversion Global Ignore Pattern

I’m slowly building up this default Subversion global ignore pattern for my Windows & Xamarin Development on Windows & OSX.  Nothing too fancy but I’m going to keep this one up to date so others may find this useful as a default pattern or as a base for their specific requirements.

*.o *.lo *.la *.al .libs *.so *.so.[0-9]* *.a *.pyc *.pyo __pycache__ *.rej *~ #*# .#* .*.swp .DS_Store *bin *obj *_ReSharper.* *.user *.suo *.pdb *.svn *_dotCover.* *_TeamCity.* *.vs *.psess *.vsp *.vspx *.userprefs
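To actually apply a pattern like this machine-wide, it belongs in the global-ignores option of your per-user Subversion config file (a sketch; the subset shown on the last line is illustrative, paste the full pattern from above on one line):

```
### ~/.subversion/config (or %APPDATA%\Subversion\config on Windows)
[miscellany]
global-ignores = *.o *.lo .DS_Store bin obj *.user *.suo *.pdb
```

Note that global-ignores only affects unversioned files; anything already committed needs deleting from the repository as well.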

MSBuild, TeamCity & Versioning

I have only recently started using TeamCity at home for my own major project, SampleSort.  It is a pretty complex application all in all and the build process is also complex due to the modular nature of its internal structure.  Now that this is all controlled by TeamCity, proper versioning becomes a real proposition rather than incrementing a version number by hand.  Even though it is possible to version assemblies using the 1.0.* pattern in the version attributes in the AssemblyInfo.cs file, this isn’t the best solution for my particular scenario.

Now if you are using source control it is useful to have the version number of the source incorporated into the overall application version.

Using TeamCity I have manually set the Major and Minor so that I have some control over it but the build number comes from TeamCity and the revision is taken from my Subversion repository.  To do this you need to use a pattern like this in the ‘Build Number Format’ setting on the General Settings page of your build configuration:


0.6.{0}.{build.vcs.number}

The 0.6 section is the portion I retain control over, the {0} is filled in by TeamCity and the {build.vcs.number} is the Subversion repository revision number.

Now to make use of this in the build process you need to access this information and update the AssemblyInfo.cs files BEFORE the build system performs the Release build. I looked at the community tasks AssemblyInfoTask but was a little put off.  For one thing it creates the file rather than edits the existing one, which in a 20+ project scenario would result in a fairly hefty build script.  All I wanted to do was update the existing AssemblyInfo files.  So, my MSBuild target is as follows:


<Target Name="VersionAssemblies">
  <Attrib Files="@(AssemblyInfoFiles)" Normal="true"/>
  <!-- FileUpdate is from MSBuild Community Tasks; $(BUILD_NUMBER) is the environment variable TeamCity sets -->
  <FileUpdate Files="@(AssemblyInfoFiles)" Regex='AssemblyVersion\(".*"\)' ReplacementText='AssemblyVersion("$(BUILD_NUMBER)")' />
  <FileUpdate Files="@(AssemblyInfoFiles)" Regex='AssemblyFileVersion\(".*"\)' ReplacementText='AssemblyFileVersion("$(BUILD_NUMBER)")' />
</Target>

So here we are using RegEx to perform a find and replace on the AssemblyVersion and AssemblyFileVersion attributes within the AssemblyInfo.cs files. Nice!
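If you want to see the mechanics of that find and replace in isolation, here’s the same idea sketched in a few lines of shell (illustrative only — the file content and version number are made up for the demo):

```shell
# Create a throwaway AssemblyInfo.cs and stamp a new version into it,
# mimicking what the FileUpdate task's regex does.
BUILD_NUMBER='0.6.123.4567'
printf '[assembly: AssemblyVersion("1.0.0.0")]\n' > AssemblyInfo.cs
sed -i -E "s/AssemblyVersion\(\"[^\"]*\"\)/AssemblyVersion(\"$BUILD_NUMBER\")/" AssemblyInfo.cs
cat AssemblyInfo.cs   # → [assembly: AssemblyVersion("0.6.123.4567")]
```

The pattern matches whatever version string is currently in the attribute, so it doesn’t matter what the files were last stamped with.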


NCover MSBuild Task

After having configured my MSBuild scripts for use in TeamCity I found a rather nifty feature in the NCoverExplorer UI.  You basically configure the NCover report via the UI.  Open up the NCoverExplorer application and click File -> Run NCover.  You are then presented with the Runner dialog box where you set various parameters in the UI.  You can then press various buttons and it will write your NAnt/MSBuild or command line for you!  Very neat …


MSBuild – Editing Xml Scripts

Editing MSBuild XML files can be a bit of a chore. For general everyday file editing I tend to use Notepad++, which is an awesome file editor and viewer. However, you can use Visual Studio to edit your MSBuild files and at the same time get IntelliSense!

Open up Visual Studio, go to Tools -> Options and then the Text Editor options, where you can associate file extensions with text editors. For MSBuild files we need to associate extensions like “proj” with “XML Editor with Encoding”. Now we also have IntelliSense to help write the script.

Now Visual Studio will be able to load the .proj MSBuild script into the correct editor and you can obviously associate *.proj files with Visual Studio to launch into them from Windows Explorer.


CI – TeamCity

I have finally completed the set-up and configuration of my new continuous integration system for SampleSort.  I have to say that I’m seriously impressed with TeamCity.  The installation and configuration was straight-forward and has been rock solid.

One thing I would say is that after installation I’d recommend switching to SQL as the database back end immediately.  I started off using the internal database and found the migration tool very awkward, so I’m now missing some data from the initial build runs, but I can live with that.

I have configured two build configurations for my SampleSort project.  The first build configuration works in Debug mode and runs the test and coverage reports (xUnit, NCover, NCoverExplorer).

This debug build configuration is triggered after a Subversion source code commit and is set-up to fail the entire build if there are any tests that do not pass.

The second build configuration works in Release mode and performs a simple Release build of the same source code on the event of a successful Debug build.  The last step in this Release build configuration is to then run an Inno Setup compiler script to package the release mode application into the final application installer.

At the moment this last task is simply an MSBuild Exec command, so it all happens outside of the TeamCity scope if you will; this just means that any reporting isn’t included in the TeamCity results.
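For reference, that Exec step amounts to little more than this (a sketch — the ISCC path and the .iss filename are assumptions, not taken from my actual script):

```xml
<Target Name="Package" DependsOnTargets="Build">
  <!-- Shell out to the Inno Setup compiler; nothing in here is visible to TeamCity's reporting -->
  <Exec Command="&quot;C:\Program Files (x86)\Inno Setup 5\ISCC.exe&quot; MyInstaller.iss" />
</Target>
```

Exec does at least fail the build on a non-zero exit code, so a broken installer script won’t slip through silently.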

So far getting this all set-up and configured has actually been a very time consuming task.  There have been many points of confusion along the way, so I’m going to add another blog item soon that goes into a bit more detail of how I went about getting each of these build tasks set-up.  I still have a few niggles that need to be ironed out of my script that relate to MSBuild batching; basically during the test phase there are actually multiple passes of the tests in order to produce the test reports.  This does have a knock-on effect of making each build pass a little more time consuming than it needs to be, but in the great scheme of things this isn’t a huge problem.

I have to say that to have all of this (TeamCity, xUnit, NCover, NCoverExplorer) for free really is inspiring in many ways.  A lot of blood, sweat and tears went into making all of these tools and I’m now benefitting (as are my users) with zero financial outlay; I tip my hat to the developers!  I also have to thank Patrick for the free NDepend licence!!


Continuous Integration – TeamCity

Over the last 6 months I have tried a few CI systems for my own projects. I’ve been looking for something that will formalise the “Clean, Build, Test & Deploy” process so that it’s more predictable, repeatable and almost more importantly – recorded. Nothing major really, I’m not looking to CI to merge the output of multiple streams of development or some amazingly intricate process that itself requires pages of documentation.

I have tried CruiseControl.NET and the more recent offering from ThoughtWorks (an update to CC.NET), Cruise. These are great systems, but I found them a bit of a nightmare to set up, at least in light of my particular intended use. In the end I gave up on both; they were simply too fiddly to configure.

Anyway, my friend Lee recently got in touch with a very favourable review of JetBrains TeamCity. Since I really wanted to have a CI system in my workflow I decided to take a closer look at TeamCity.

I downloaded the TeamCity Windows installer exe and started the installation process, all seemed to be going very well. Once it had completed I was presented with the page to create an Administrators account and I was then promptly taken to the Projects page. To cut a long story short it appeared that the installation process hadn’t been entirely successful, couple that with an unfamiliar system and I was pretty confused.

There were no Build Agents installed and running on the system and no Windows Services had been installed. On top of that the ‘evaluation’ database seemed to have been installed incorrectly as it had not stored my Administration account which meant that I couldn’t log back in! So, all in all this installation was pretty broken in reality. I was a bit fed up at this point and decided to un-install the software and just get on with some development work.

Anyway, that was on Thursday … on Friday I decided to have another go, went through exactly the same process and, lo and behold, the installation actually worked. Within about 2-3 hours I had an entire MSBuild process fully configured and running, triggered by Subversion check-ins.

My build process runs right through from cleaning the checkout location and updating the source code from Subversion to running my xUnit tests via Gallio (with reporting). All good! The only thing left to incorporate is a ‘Package’ target that runs once the Build / Test targets have completed successfully and produces the final installer executable. However, I may keep this as a separate project, not sure yet.

The upshot is that I can highly recommend TeamCity. Considering all my attempts at getting CI systems working, TeamCity is certainly the easiest to get up and running with. I use CC.NET and Cruise at work, so it will be interesting to see how they compare once I’ve had a bit more time with TeamCity.

I’m now a student of MSBuild, any good links appreciated!!