Return to contracting!

Well, after almost two years with Money Dashboard, building their excellent Personal Finance Management tool, I’m returning to contracting.

Why am I bailing from a full time job and returning to contracting? Lots of people have been asking me this over the last few weeks and I thought I’d actually put my thoughts out into the world – at least that way, it will hopefully stop me having to explain it time and time again! (And, I’d hope, it might help a few people who find themselves in a similar situation make the leap).

First up, it’s not as if I have never been contracting before – it’s something I used to do when I lived in the South East, and I did enjoy it. But with the hassles of moving, a “steady” job was far more enticing, so I left contracting when returning to Scotland. That was a good five or six years ago. In that time, the regular pay cheque was lovely, but the job lacked the stimulation and freedom that I wanted.

A number of things changed in my personal life, as well as my outlook on various things, in the past six months – and it’s in that time that I started actively looking at returning to contracting. The flexibility as well as the time to pursue some ideas of my own appealed to me. That, and a contract opportunity in Hawaii.

Anyway, the long and short of it is that I’m now contracting, trading under Wildfire Software, and will be working on developing a number of applications – some industry focused, some security related (I know, I seem to have developed a thing for security!), but generally, keep your eye out. Oh, and why not follow my “work” twitter account?

Contracting is not for the faint-hearted – you never know where your next job (let alone your next pay cheque!) is coming from – but the rewards (and freedom!) are amazing. I’m so looking forward to actually having time off!

This isn’t to say that I’m escaping from Money Dashboard … I still have the need (actually, probably more than ever now!) for Personal Finance Management … and can’t wait to see how the application matures and grows. Now I’m wanting some more contractor friendly tools to appear in it :)

Expect some pictures from Hawaii to appear ;)

Visual Studio 2010 and ALM

Today I was lucky enough to attend a Microsoft / Black Marble event at Edinburgh’s Waverley Gate on Visual Studio, Team Foundation Server (both 2010) and the general Microsoft ALM approach.

I have been using Visual Studio 2010 and Team Foundation Server 2010 since the Beta 1 days (for both!), so have a good knowledge of them both, but figured it was a good idea to go along and see what I was missing.

And it was worthwhile.

TFS and VS integrate beautifully. You really could not wish for a better combination. TFS provides an amazingly extensible foundation for your company to base its source control on – but don’t be fooled – TFS is not just another Subversion. It offers an end-to-end ALM solution – full Use Case and UML diagrams for planning, Work Item and resource tracking, all built into a warehouse architecture expertly exposed through a combination of Reporting Services and SharePoint (it hurts me to say that … you know how much I dislike SharePoint!).

One of the most manager-friendly functions added in 2010 is the ability to link query data out from TFS to Office products – from Excel to Project, your line manager can now manage you in their comfort zone. Maybe it’s a good idea, maybe not. One thing mentioned today was that anything that stops them constantly asking “how’s it going?” is probably a good thing. Now that is something I can certainly agree with! Getting decent, reliable information out to project stakeholders has always been a challenge.

The demonstration of the new Microsoft Test Manager was very impressive too – finally, something that lets us document test scripts (you know, those things you get testers to follow, right?) and gives testers an easy-to-use framework to step through them – and it captures bugs, complete with relevant tracing information. Incredible. Oh, and it all links back to TFS and your dev team. Imagine it, devs talking to testers. Whatever next? Removal of management? Maybe not – someone has to read the reports after all :)  What they DO have coming is automated testing – macro-esque, visual UI testing. It is currently only available for WinForms apps and ASP.NET websites, but was reliably confirmed as hopefully getting Silverlight support for RTM. I’m waiting with bated breath – this could potentially save me a LOT of hassle.

Blend SketchFlow is pretty impressive for mocking up applications too … but I still don’t see it unseating the larger players in this niche market, such as Axure.

The one thing that seems to be missing from all of the ALM solutions I’ve used in my career so far is an effective release management tool.

But more on that later :)

Silverlight, Cross Browser issues with GetElementById

I’ve recently spent some time tracing cross browser issues with Silverlight 3, and one in particular irked me a little.

On one project, I used the HtmlPage.Document.GetElementById managed API to read (and ultimately set) values in the hosting HTML page.

However, it seems this does not work properly in Chrome and Firefox. On the surface it all appears fine, but setting values definitely was not working as it should.

So, the workaround is a basic JavaScript function such as:

function getValue(elementName) {
    // Look the element up by name on the form and return its value (or an empty string).
    var result = document.forms["formhere"].elements[elementName];
    if (result != null) {
        return result.value;
    }
    return "";
}

And then calling out to it from managed code with HtmlPage.Window.Invoke.
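A minimal sketch of the managed side (the element id here is made up purely for illustration):

using System.Windows.Browser;

// Invoke the JavaScript helper defined in the hosting page; Invoke returns object, so cast.
string value = (string)HtmlPage.Window.Invoke("getValue", "myElementId");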

This works :)  A bit messy, but at least it DOES work reliably, unlike the built-in GetElementById. Here’s hoping it’s fixed in Silverlight 4.

Windows Azure: Now with OS Version Control!

I expect many developers have been watching the maturation of the Windows Azure platform through its CTP to the recent release – and they may have wondered how we can control exactly what our applications are running on. I mean, what if Microsoft decide to change the behaviour of the operating system your product runs on – as they did when they upgraded the Azure toolset last time?

Well, now they have given us a way to control these updates – at least, all the minor ones – critical ones would still override these settings.
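From memory, the guest OS is pinned via an osVersion attribute on the ServiceConfiguration element of your .cscfg – something along these lines, where the version string is only an example (check the page linked below for the current values):

<?xml version="1.0"?>
<ServiceConfiguration serviceName="MyAzureService"
                      osVersion="WA-GUEST-OS-1.1_201001-01"
                      xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration">
  <Role name="WebRole">
    <Instances count="2" />
  </Role>
</ServiceConfiguration>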

For details, check out this page on the MSDN site: http://msdn.microsoft.com/en-us/library/ee758710.aspx#ServiceConfiguration

Cisco Routers and XBOX 360s

Yeah, ok, probably a strange combination, but I run a Cisco router at home – for security, flexibility and ok, for the general geekiness of it.

Over Christmas I did a fair bit of rewiring for my home network (how difficult is it to find white CAT5e cable these days? Blimey! Thanks Universal Networks :)). But a problem appeared – my XBOX would not automatically get its IP address when it powered up. First I thought it was the cable, but after remaking it several times and testing it I ruled that out. Then I threw a bog-standard switch into the link and it all worked fine.

WTF.

A discussion with a friend of mine this morning made the penny drop. The Cisco has spanning tree enabled by default. DOH.

A quick

no spanning-tree vlan 1

And all was well – my XBOX now behaves properly :) Now I can listen to music while I’m having to work from home today.
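If you would rather leave spanning tree running on that VLAN, a gentler fix – and this is just a sketch, the interface number being whatever port the XBOX is actually plugged into – is to enable PortFast on that one port, so it skips the listening/learning delay that was killing the DHCP request:

interface FastEthernet0/5
 spanning-tree portfast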

Edit:
Ok, after a message from the friend in question, I’ve decided that I have to finally make an edit to this post to actually name (and shame) my on-tap Cisco Geek.  And I introduce you to Mr Hill, also online at Ninja Badger. And yes he is a *REAL* Cisco Geek.

Hosting, Virtual Hosting, Dedicated Servers .. what's the diff?

I’ve been going through the infrequent hassle of moving my hosting provider.

I previously had a dedicated server (a very nice, very powerful box, but I doubt anyone is interested in that ;)) with a company called UK Solutions down in Redditch, but recently decided that, as I no longer needed the flexibility of my own virtualised system (and, of course, want less hassle so I can devote more time to my other budding projects!), I would migrate everything and terminate the co-lo agreement.

And so the fun began.

I started off with a Reseller account at 34sp.com (basically an account that lets me throw all the sites etc. up and let someone else deal with the hardware and management – much nicer for me these days!). As they are using Plesk, things were a little fiddly to start with (and I still don’t get the icons loading, but hey, it works), but once I got the hang of things it all went smooth as anything. I even get daily reporting straight to my inbox :) And customer control panels. Dear god, that’s scary … I can get them to do their own email setup and leave me in peace. Next thing you know, I’ll be charging for hosting. Oh look, a flying pig.

I fired off my contract cancellation to UK Solutions with a good couple of months left on my current term, and was rather dismayed to see that they added another five weeks onto the term to bring it up to a full quarter. It seems there is a three-month exit clause on the contract. Argh. You would have thought that, with a decent period left on the term anyway, they could have shown a little flexibility on this clause – especially when I’d been a customer of theirs for three years without any hassle.

Two points for signing off this minor rant :)

1. Check your contracts … always read the small print

2. You don’t ALWAYS get what you pay for. 34sp.com are a LOT cheaper than UK Solutions, and so far I have absolutely no complaints – even when logging fault reports, things get resolved swiftly. Keep up the good work guys.

Enterprise level Hyper-V - Experiences

Over the past few weeks, us geeks at Money Dashboard have been hard at work building our production environments, and as I am sure you can imagine we hit a few issues.

In order to help out anyone else who might be thinking about deploying (or is already deploying) a fairly complex environment around Hyper-V, I thought I would share our findings.

First off, as I’m sure you can all appreciate, I am unable to go into any real specifics of our implementation, so some of this information may be a little strange or difficult to follow – bear with me, and hopefully if you ever find yourself in a similar situation it might just help you out!

iSCSI and Virtual Machines

If, like us, you are virtualising your SQL Servers, then do not forget that you will need to bring some iSCSI (or whatever storage system you use) mounts through to the VMs. On the surface this does not pose a problem, but we DID hit problems when pulling iSCSI through. As you can imagine, we are running Jumbo Frames (MTU 9000) on our iSCSI network in order to optimise throughput, but the default VM adapters only support standard packet sizes (i.e. MTU 1500). To get around this, you need to use the Synthetic Network Adapter in the Hyper-V VM, and be sure to set its properties to enable Jumbo Frames. You must also have the physical NIC on the server set for Jumbo Frames. It is always worth checking with the following command:

netsh interface ipv4 show subinterface
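And if the MTU is not what you expect, it can be bumped with the matching set command – a sketch only, the interface name below being a placeholder for whatever yours is actually called:

netsh interface ipv4 set subinterface "Local Area Connection 3" mtu=9000 store=persistent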

You may, like us, then notice some packet loss on the iSCSI adapter. In our case this turned out to be something strange in the way the Synthetic adapters were behaving with our Broadcom NICs (BCM5709, in case anyone is interested!). Disabling all the offload components (TCP, iSCSI and Checksum) fixed the problem, but we still do not know exactly why it was occurring …
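For what it’s worth, the TCP Chimney Offload part can be checked and turned off globally from the command line (the checksum and iSCSI offload settings, on the other hand, live in the NIC’s advanced driver properties / BACS, so this is only part of the picture):

netsh int tcp show global
netsh int tcp set global chimney=disabled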

QoS …. do NOT forget it

Make sure you split your Live Migration and Heartbeat traffic onto separate networks – and most certainly remember to apply those Quality of Service rules on the switchgear.

We forgot, and as soon as Live Migration kicked off, the complete Hyper-V cluster went mental … it thought that all the other nodes had failed, so EVERY node went to start EVERY virtual machine. As you can no doubt imagine, absolute chaos ensued, and the virtual machine disks were corrupted (one catch with using the new Cluster Shared Volumes it seems).

Dell EqualLogic, and BACS

We are lucky enough to be using pretty much all Dell kit, including the Dell EqualLogic PS6000XV as our SAN. One snag we did hit is that you really do not want to route your iSCSI traffic over a virtual adapter created through the Broadcom BACS suite … on the surface it will appear to work, but when you look at it carefully you will notice a significant bandwidth limitation creeping in somewhere. I’m not sure if it was the BACS drivers or the Dell MPIO driver, but it disappeared when we reverted to using proper physical NICs. Equally, do not forget to install the Dell MPIO drivers into any virtual machines that are using iSCSI :)

Adding Hyper-V Clusters in Microsoft System Center Virtual Machine Manager

When you are finally ready to add your machines into SCVMM, add the cluster name – not the individual machine names. It seems that if you add the individual machines, SCVMM does NOT treat them as an HA cluster. I haven’t found any logical way of merging multiple machines into a cluster in SCVMM, or any real reason why it doesn’t prevent you from adding the individual nodes in the first place (it can see they are in an HA cluster configuration, after all).

Summary of the kit we used:

Dell EqualLogic PS6000XV SAN
Dell PowerConnect 6248 gigabit switch stack
Dell R200 1U Rack Mount Servers
Dell M1000e Blade Centre

An awful lot of cabling.

Microsoft Windows Server 2008 R2 Enterprise (Both full and Core)
Microsoft Windows Server 2008 R2 Standard (Both full and Core)
Microsoft System Center Virtual Machine Manager 2008 R2
Microsoft System Center Operations Manager 2007 R2

And a ton of custom scripts.

Thanks go to Dave Veitch from Company Net for assisting in the configuration!

Code Regions - Their use and misuse

Over the last few evenings I have been reviewing code written by a fellow developer, and have come to a conclusion.

The .NET code regions (aka Collapse Regions, Collapsible blocks and dear god knows what else in other languages) can really be misused.

Correct use of regions can group common sections of code – for example, code that all interacts with the same property (you might have the property itself plus a number of private functions that work with or derive from it) – improving readability (yes, we all know we should not have units / classes that size, but we all do, right?) and general neatness (it is cool seeing it all shrink down, isn’t it?). BUT, likewise, you can seriously go overboard.
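As a trivial, entirely made-up sketch of the kind of grouping I mean:

#region ConnectionString handling

// The property itself, plus the private helper that exists purely to support it.
public string ConnectionString { get; set; }

private bool IsConnectionStringValid()
{
    return !string.IsNullOrEmpty(ConnectionString);
}

#endregion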

The code in question has many levels of nested regions, and although they certainly do tidy the code up, they certainly do not help its maintainability.

I’m sure I’m not alone in using a lot of keyboard shortcuts when I’m developing, but add many, many regions, with many layers of nesting, and the keyboard shortcuts for collapsing and expanding regions become pretty useless. No matter what you do, you always seem to end up with too much or too little code on screen – either way, it makes development difficult.

Which brings me onto a final point … if that is the case (i.e. the code is damn near impossible to understand at a glance because of the structure in place), doesn’t it qualify as unmaintainable code? Perhaps we should be setting up rules in TFS and the like to detect layers of nested regions and flag them for review when someone goes to excess?

Comments, as always, welcomed :)

Visual Studio 2010 Beta 2 and November Release of Azure Tools

I’ve just had a slight dance with the new release of the Azure Tools (that adds VS 2010 Beta 2 compatibility), and thought I would quickly note down the issues I hit – and of course the solutions!

First off, as you can expect, a few things have been renamed:

RoleManager is now RoleEnvironment
ILocalResource is now LocalResource
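So, for example, grabbing a local resource now looks something like this – a minimal sketch, where “ScratchSpace” is just a placeholder that has to match whatever you declared in your ServiceDefinition:

using Microsoft.WindowsAzure.ServiceRuntime;

// RoleEnvironment (formerly RoleManager) now hands back a LocalResource (formerly ILocalResource).
LocalResource scratch = RoleEnvironment.GetLocalResource("ScratchSpace");
string scratchPath = scratch.RootPath;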

Don’t forget to read the release notes and recreate your local dev storage databases … but watch out – the command in the docs is incorrect – it’s missing a colon! The command should be dsinit /sqlinstance:&lt;instance&gt;.

Social Networking and Security / Privacy

As I’m sure some of my (two!) regular readers will remember, I have previously blogged about the security (or lack thereof) when it comes to using Social Networking websites – and specifically when you make use of any of the applications on them (here's looking at you Facebook).

What’s more disturbing is that I came across this article today, which is equally concerned about it. It actually increases my concern about Facebook – I hadn’t realised that when a friend takes a quiz, or makes use of an application, they can provide access to MY data through it. Interestingly, there ARE some controls on Facebook to limit this, but I have yet to find a route to them through the settings – instead you have to use a direct URL: http://www.facebook.com/privacy/?view=platform&tab=other

Maybe you should all check to see what is accessible (take the quiz mentioned in the article, you might be surprised) and adjust your settings …

I’m all for Social Networking, but please, can we at least have some accountability?