.NET 4.0-Platform Update 1

Today saw the release of what is, effectively, .NET 4.1 – although for some rather odd reason Microsoft have decided to name it slightly differently: .NET 4.0 Platform Update 1. Now there's a mouthful (and no doubt it will be highly confusing for end users when you ask them to install the .NET 4.0 Platform Update 1 runtime …)

So what’s new? Well, if you are not using Windows Workflow Foundation, it seems you might as well skip this update – as that’s where the changes are. But then again, there are some very interesting changes here, with the addition of state machine workflows (as well as SQL-backed persistence, which is supported in Azure).

Although this update is not really going to apply to many general .NET developers, what annoys me is the naming. And it seems I’m not alone. Why on earth someone had the bright idea to come up with this insane name, I really don’t know. And to release it as three separate packages, too.

I wonder, are we going to see the demise of the good old major.minor.release.build style of version numbers in favour of something more freehand? If we do, I think it’s a step in the wrong direction, and a walk towards versioning / distribution hell …

Team Foundation Server 2010-Process Template Editor

I can almost guarantee that if you use TFS, you will need to edit a process template sooner or later; the default forms that TFS provides, although good, always need tweaking to fit how your team works.

I even find the EMC Scrum pack needs tweaking at times (I mean, why is there no “assigned to” field for a bug?).

So, the easiest way to do this is to ensure you have the Team Foundation Server Power Tools installed, fire up Visual Studio, then click Tools and select Process Editor – then you get to choose what you want to edit!


The ones I most commonly end up editing are Work Item Types – and specifically, I tend to cheat and edit them through this tool directly on the server.

Now, be sure to abide by all the warnings when editing process templates. These changes kick in immediately, and affect EVERYONE on the dev team using this project. You have been warned.

Also remember to export any modifications and re-import them on other project collections that use the same template for consistency.
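If you’d rather script that export / import step than drive the Power Tools GUI, the witadmin tool that ships with Team Explorer 2010 can do it. A sketch – the collection URL, project names and file name here are placeholders for your own:

witadmin exportwitd /collection:http://tfsserver:8080/tfs/DefaultCollection /p:MyProject /n:Bug /f:Bug.xml
witadmin importwitd /collection:http://tfsserver:8080/tfs/DefaultCollection /p:OtherProject /f:Bug.xml

Export from the project you tweaked, then import into each other project that should stay consistent.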

Cloud Computing-Too risky?

Amazon recently posted their response to the outage that hammered their EC2 platform. It would seem that the outage itself was triggered by a piece of network maintenance that was not carried out properly, which in turn triggered a rather catastrophic chain of events within Amazon’s custom systems. Ultimately, it resulted in data loss, as well as downtime, for many customers – such as Heroku, who posted their own post-mortem of the incident.

Microsoft Azure also suffered problems recently – with parts of their system becoming unavailable. The first in March was blamed on an OS upgrade that went awry, then in April there was an Azure Storage outage – for which I’ve not actually seen any real detail on the cause (if anyone has a link, please point me to it – I’d love to know what happened). However, I think the stark contrast between these two vendors is the transparency and information given – both at the time and after the fact.

Amazon have gone the whole hog: totally admitting the fault, identifying exactly (in full Technicolor) the issues that occurred, and resolving themselves to – publicly – fix it. And they have issued a decent amount of compute time as a refund. Microsoft? Well, I’ve not heard of any refunds – even partial ones – for the outages that occurred on their platform. I’ve also not heard of any refunds related to outages on another of their cloud platforms – Business Productivity Online Suite – which has had its own problems of late.

So is using cloud technology too risky? In a nutshell, no, as long as you are sensible. I can’t say that I would advocate putting everything in the cloud unless it’s totally stateless and can keep operating if any SQL instances etc. disappear. If you need to store state, or anything really sensitive, I still prefer the hybrid model – but I guess that’s because the vendors need to do more to convince me that they are as secure as they proclaim to be.

The biggest problem to date with people using clouds and suffering outages is quite simply education. They have put applications up into the cloud and expect them to be highly available. That’s not automatically the case. You still need to understand the requirements of highly available design, and be sure to implement them – including setting your application up in different zones / regions, and ideally different geographical locations! If you don’t, all you are really doing is running a small cluster.

I know that many people will be screaming about the EC2 outage in particular, where this was caused by human error. But I’d love to see them do better in their own data centre. Human error occurs everywhere – and where do you think the resources (i.e. skills AND money) are to mitigate it better? On premise with yourself, or out in a cloud?

Team Foundation Server-E-Mail Alerts

I love things that remind me to do things. I’m a forgetful person, and I need prompts. So I guess that's why I’m a big fan of e-mail alerts, and one of the first things I do in TFS is configure them.

The only annoyance I’ve ever hit with TFS alerts is the rather strange inability to set up authentication for the SMTP server used for the alerts. I like to use an external server, as for a lot of the jobs I’m involved in the users are very widely geographically dispersed, and all on different providers.

While you can correct this shortcoming by editing the TFS web services config file (found at C:\Program Files\Microsoft Team Foundation Server 2010\Application Tier\Web Services\web.config), I don’t like this approach, as I feel that it is risky – you never know if something will overwrite this file, especially a service pack etc.
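For reference, SMTP credentials in a .NET web.config normally take the standard system.net mailSettings shape – something along these lines (the host, addresses and credentials are placeholders, and whether your TFS version honours every attribute here is exactly the sort of thing you’d need to verify against your own install – part of why I don’t like this approach):

<system.net>
  <mailSettings>
    <smtp deliveryMethod="Network" from="tfs@example.com">
      <network host="smtp.example.com" port="25" userName="alerts@example.com" password="secret" defaultCredentials="false" />
    </smtp>
  </mailSettings>
</system.net>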

So, what do you do?

Simple: install the SMTP Server feature of IIS on the server, and run a locally restricted SMTP service that relays to your smart host.


Once you have installed the SMTP Server, it’s important that you secure it – ideally, grant relay permissions to the local machine only. Then go to the Delivery tab, and click Advanced. Specify the details of your external smart host – if you need to provide authentication, you will find the relevant options under the Outbound Security button.

Then, open up Team Foundation Server Administration Console, click on Application Tier, and then Alert Settings (over on the right). Fill in the boxes, and away you go.


Oh, and one last thing – make sure the SMTP Service is running!

The final step is to use the excellent Alerts Explorer tool that is in the TFS PowerTools pack to setup your alerts.


Team Foundation Server-Automated Backups

One of the things that never ceases to make me smile is the number of companies running Microsoft’s Team Foundation Server software … who don’t back it up.

For those that don’t know, TFS can be looked at as a central store for pretty much all the work that goes on inside a software company. Neglecting to back it up is opening yourself to disaster.

As the TFS databases are nothing more than SQL Server databases, you can back them up in the normal SQL way, or use a tool – there are multiple databases, and you have to capture them all at the same time and in a consistent state, which is not always easy to achieve by hand. My tool of choice for handling these backups is actually part of the Microsoft Team Foundation Server Power Tools, and integrates neatly into the TFS Administration Console.
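If you do go the plain SQL route, it’s just a BACKUP DATABASE statement per database – a minimal sketch, assuming the default TFS 2010 database names and a backup folder that exists (both placeholders for whatever your install actually uses):

-- Configuration database, plus one statement per collection database
BACKUP DATABASE Tfs_Configuration TO DISK = N'D:\Backups\Tfs_Configuration.bak' WITH INIT
BACKUP DATABASE Tfs_DefaultCollection TO DISK = N'D:\Backups\Tfs_DefaultCollection.bak' WITH INIT

The catch is keeping every database consistent with the others – which is exactly what the Power Tools backup plan handles for you.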

The first step after installing the tools is to Create Your Backup Plan.


Some things that you will need before you start are:

* A network location where you want your backups to go
* An idea of how long you want your backup retention to be (defaults to 30 days)
* An idea of how you want to schedule your backups – the default nightly runs at 2am local time

Now, my TFS server doubles up as the main fileserver, so I cheated and entered a local path in the Network Backup Path (these in turn are synced off to a remote device nightly). This, although accepted, failed the Readiness check – as it’s not a network path.

One strange gotcha: I chose to run Full and Transactional backups, leaving Differential off – and it turns out you have to uncheck any checked day-selection boxes before you can continue.

The other thing that caught me out was that the Grant Backup Permissions and Backup Tasks Verification steps were failing, saying that my account did not have suitable rights for the backup location (strange, as I’m an Admin, and I have full rights to both the NTFS folder and the share). After checking the TFS and SQL Server logs, the problem turned out to be that my target share had a space in its name. Putting quotes round it doesn’t help either; it just causes something else to fail.

And the third, and final, thing? Don’t use the Local System account. Remember to set up your own account – restricted where possible – for all services.


Mvc Scaffolding - Part two

I need to start tonight’s blog post with an apology … in last night’s post, I neglected to explain what NuGet is – I kind of took it for granted that you developers out there would know.

NuGet is a Microsoft-backed package manager that is intended to make the introduction of new developer frameworks into Visual Studio and your projects easier – removing the need for you to remember all the assembly dependencies. All you do is open the Package Manager Console (View, Other Windows, Package Manager Console) and key in a few PowerShell commands – NuGet will pull down the dependencies onto your machine, and update your open project. Simples.

So, let’s get rocking. We have our sample project (MVC 3 Razor) up and running. For this sample we are going to build a very simple prototype around an order, and the information it would hold.

We need the following classes:


using System.ComponentModel.DataAnnotations;

public class Address
{
    [Key]
    public Guid AddressID { get; set; }

    public string AddressLine1 { get; set; }
    public string AddressLine2 { get; set; }
    public string AddressLine3 { get; set; }
    public string City { get; set; }
    public string County { get; set; }
    public string PostCode { get; set; }
    public string Country { get; set; }
}


public class OrderItem
{
    [Key]
    public Guid OrderItemID { get; set; }

    public int Quantity { get; set; }
    public string Description { get; set; }
    public decimal Price { get; set; }
}


public class Order
{
    [Key]
    public Guid OrderID { get; set; }

    public Address DeliveryAddress { get; set; }
    public Address InvoiceAddress { get; set; }

    public DateTime OrderDate { get; set; }

    public List<OrderItem> OrderItems { get; set; }
}
You’ll probably notice the [Key] decoration in that lot – these mark the primary keys, which Entity Framework Code First needs in order to set up the necessary indexes for CRUD operations on the database. The attribute lives in the System.ComponentModel.DataAnnotations namespace, so you will need to add a using statement for it to each of your class files.

So – that defines an order which has two addresses (one for delivery and one for invoicing), and a list of items.

Let’s go ahead and create a controller, and the CRUD pages, for the Order. Open up the Package Manager Console, and type in:

scaffold controller Order

You will see that the MvcScaffolding framework gets busy and creates the relevant update pages – but most importantly, it creates a Context, also within the Models folder. In this example it’s called ScaffoldingExampleContext.cs. This brings us to the first couple of annoyances with Scaffolding and EF Code First.

- The database context gets placed in the same project – no easy separation here to give you a distinct data access layer.

- The context connection string expects you to have SQLExpress installed, and operating as a named instance of “SQLExpress”.

On my machines, the latter is not the case, so we need to pop into web.config and add an entry to the <connectionStrings> block:

  <add name="ScaffoldingExampleContext" connectionString="data source=.;Integrated Security=SSPI;Database=ExampleData" providerName="System.Data.SqlClient"/>

Fire up your project, and go to /Orders/ on the site that launches – you will be presented with a very simple CRUD interface for the Orders.

You would need to run the same MvcScaffolding command (scaffold controller <classname>) against each of the classes, and then you have a little bit of manual plumbing to do – such as navigation, and in this case, the actual selection of things like Addresses and Items.
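For this example, then, covering the remaining classes is just the same command again in the Package Manager Console, once per model:

scaffold controller Address
scaffold controller OrderItem

After that, the manual plumbing still applies.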

Scaffolding, at this time, cannot handle complex objects such as other classes or list definitions to give you a total-coverage CRUD interface, but it can generate most of the simple stuff that makes up a lot of the work involved in most prototypes.


Mvc Scaffolding - Part one

I love learning new things connected to software development – and anything that has the potential to save me time is always high up on the list of things to investigate. After all, time is money – either for yourself when you are a contractor, or your employer if you are an employee!

I’m sure you all have been using MVC (Model View Controller) structures for a while now, but I’ve only recently started looking at the “Scaffolding” approaches that are out there – these are quick, simple ways of building, essentially, your prototype application based only on the base models. So you create one class, and the Scaffolding framework builds the views, and the controller for you. Job done. Oh, and it also sorts out the persistence (read: database) for your models too.

Sound too good to be true?

Well, I’m not going to say it’s perfect – in fact, it seems to have some insanely annoying “defaults”, some of which I’m going to introduce you to as we go through this introduction to Scaffolding over the next few days / posts.

First off we need to install the scaffolding stuff – the easiest way to do this is to use NuGet in Visual Studio 2010.

Create a new MVC 3 Razor project (my preference!).

Open up the Package Manager Console, and type in:

Install-package mvcscaffolding

Sit back and wait while the packages are installed. A reference to the scaffolding libraries will be added to your project automatically.

Now, I’ll go into actually using the scaffolding library in my next post, but first I want to bring your attention to what is, in my opinion, one of the most annoying things about this library.

It uses Entity Framework Code First (EF Code First). Basically, this is a layer on top of Entity Framework that lets you write your models and have the database schema generated for you. So far so good. But it looks for an instance of SQLExpress. A total pain in the ass in my case, as I don’t have it installed as “SQLExpress” … so it fails out of the box. Wonderful. There isn’t a nice way that I’ve found to change this across the board, so you are down to changing it per project. Again, I’ll cover this in my next post, as you really need to have at least one Model “scaffolded” first so you know what context names etc. will be used.

A year of contracting ..

It’s just under a year since I decided to return to the wonderfully unstable world of contracting in Software Development.

And I have to say it’s been an interesting time – it’s certainly had its ups and downs …

But, I’ve gotten to work on some truly interesting projects, be involved with some inspirational start-ups and extended my skills.

One thing that caught my attention early in my re-entry to contracting was the telecoms sector – this has always been something I’ve been interested in, and having worked a little with Cisco kit in the past, I figured it was time to push forward with it. So I have.

Over the last year I’ve attained my Avaya certification, become an Avaya Technical Partner and I keep pushing – extending my skills into other makes, including Cisco, Mitel and Nortel. I’m loving the variety that this industry brings to the table, and it’s a bit of fresh air from “normal” lines of business applications.

Most scarily, I haven’t had to advertise for work. I’m starting to think that I might, as new projects are starting to thin out (damn economy), but so far things have been good. That isn’t to say I’m not keeping my eye on the job market, looking for that ever “perfect” position. You know, that fabled one that is a joy to work at, has free coffee and means you still have a weekend? Ok, I know, maybe a stretch there – I’d give up SOME of the weekend, I guess.

I wonder what the next 12 months will hold.

Software Development Buzzwords-Scrum and Agile

Everywhere you look now it seems the software industry is singing the praises of Agile development practices – and in particular Scrum.

The actual process was identified back in the late 1980s, although it was only introduced to software development in the 90s.

So what exactly is Scrum? Well, it means different things to different people, and it’s an approach – not a hard and fast set of rules, or an exact process. Essentially, it’s all about breaking work down into small segments that can be done in a short time frame of at most 30 days (known as a Sprint). At the end of each sprint you have a new version that you can, if you so desire, release.

Many companies look at Scrum as a silver bullet to solve all their development best practice woes. It’s not. If you have problems with your best practices (for example, source control, bug tracking and such) then Scrum will NOT solve them for you by itself – and I can’t help but get annoyed when people imply that it can.

Or even when people imply that there is only ever one way to “do” Scrum. I fear that these days the words Agile and Scrum have become buzzwords for the development industry – much like Extreme Programming (XP) was a few years ago. Will Scrum head the same way? I don’t think so – most development houses have probably been running a Scrum-style approach without even realising it …

Small Dev Team-Tips for work scheduling

I’ve just been planning some work for fellow developers, and thought I’d throw some thoughts up on what I feel creates a good approach for working as a team.

Know the “critical path”
Before you get started, try to identify what problems you will encounter (easier said than done), but be sure to have a clear understanding of how you are going to approach the development and what things need to be done in what order (i.e. what will cause a delay if it’s not done in time!).

Keep work segments small
Try to ensure that any given task can easily be completed in a day – that includes researching it, building it and ultimately testing it. If it’s going to take longer, then break it down further. Keeping tasks small like this helps people stay motivated, as they feel that the project is progressing a little bit further every day – and it’s measurable.
If you find that you are stuck on something, or it’s taking longer than you expected, look at it again – sometimes even scrapping it and starting again totally, although annoying, is the best thing to do.

Make use of people’s strengths
Development is a very broad term, and there are many subtle facets to it. For example, some developers are better at GUI work, others better at service or back office code. Others might be more skilled at doing Silverlight, ASP.NET or maybe Win Forms. Don’t always assign tasks purely based on strengths though – we all like a bit of variety, and it’s important to not find things get too routine.

Use source control
Now this one I really cannot stress enough. I don’t care if you use SVN, VCS, Team Foundation Server, Git, or whatever.
No matter how small your team, you MUST run source control. Personally I have every project I work on (yes, even if I’m the only developer) source controlled. Why? It means I know what I’ve done, what’s remaining to do (Trac or TFS are great for this) and it gives me the all-important backups. However, for source control to be really useful, it’s important to have some rules. Some simple ones that I tend to mandate are:
- Code is checked in at the end of every day. If it’s not finished (you did read the above though, right?), then shelve it (again, TFS is very good for this), but locks must be cleared and code must go in.
- Run code analysis automatically
- Run gated check-ins. If you break the build, sorry, you are the one that has to fix it.
- Comments rule. Checkin comments should NOT be optional.
- Associate any tickets. If you are using TFS or Trac, or similar, then the relevant work item tickets should be associated with the check-in. Simple really.
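For the TFS users, the end-of-day rule above maps onto a couple of tf.exe commands – a sketch (the shelveset name and comments are made-up examples, not anything prescribed):

tf shelve "EndOfDay" /comment:"WIP: order screen, validation still to do"
tf checkin /comment:"Order screen complete"

Associating the work item ticket itself is done from the Pending Changes window in Visual Studio.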