
Lessons from the great 2009 Birmingham City Council website disaster

The night before last, under cover of darkness and without much fanfare, Birmingham City Council switched over to its rejigged website.

Within moments the twittersphere was alight. It was crashing, it had obvious faults and it looked terrible. Over the next 36 hours reviewer after reviewer found fault after fault.

This would not be news – how often do bad government websites get launched? – if it hadn’t been for the efforts of Paul Bradshaw (a lecturer from Birmingham in online journalism) and his project Help Me Investigate.

Following up on local Brummie whispers, they put in an FOI to the council and discovered that the cost of this website revamp was, wait for it, £2.8m.

What they bought

Since the launch the lead for the council, Glyn Evans, director of business transformation, has written two blog posts and done a couple of interviews.

From this – not from information provided as a result of the FOI, because of ‘commercial confidentiality’ – it is possible to glean that this £2.8m essentially covers building a content management system (CMS, presumably from scratch), transferring the content and building a workflow system (and there’s a mention of training staff). Oh and ‘find my local’, which much smaller councils have had for ages.

From this new CMS they will be able to offer better (i.e. now standard) online services integrated into it and basics like RSS feeds – but nothing suggests that the cost of those developments is included and any webbie worth their salt will ask why any CMS should cost that much, even with bells and whistles and silver-service catered training sessions attached. Unfortunately we’re unlikely to ever know real details, because of that ever so useful cover-up tool of ‘commercial confidentiality’.

On the reality of what it has actually bought, the council admits as much in its response to the FOI:

Yes, the new council website is costing more than was originally estimated but actually it’s not an overspend. Originally, the website replacement project was just that – a replacement for the (obsolete) technology we are using. And then we rethought; the new website will be at the heart of delivering improved services to everyone who lives in, works in or visits Birmingham. This is a totally different proposition, and a totally different approach – and budget – was required. Also, yes, we did consider Open Source but no, it wasn’t suitable, otherwise we’d be using it.

And again, yes, the website is being delivered later than was originally planned but it’s not the same website as was originally planned.

None of this sounds like anything other than a CMS upgrade. And someone should ask for the papers on just how they decided that Open Source wasn’t a solution.

Expensive leeches

Why was it £2.8m? It is hard to see how it could be that much – other than that the job was outsourced to the infamous public sector contractor Capita.

Says Stuart Harrison (aka pezholio):

By all accounts the web team have had very little involvement and most of the grunt work has been done by the council’s outsourced IT ‘partner’ Service Birmingham – operated by everyone’s favourite outsourcer Capita (or Crapita if you read Private Eye).

Stuart’s comment about the rumoured exclusion of the council’s own webbies from the website development process is underlined by a comment on Josh Hart’s blog, which notes the irony of the City Council’s website manager giving a presentation about the semantic web and triple-tagging at a local geeky meeting, in a personal capacity: expertise about as related to his own council’s website as space travel is to an Indian villager.

I can relate. In my past capacity in local government there was a vast distance between my role in the ‘egov community’ (respected, invited to speak, present and comment) and my actual – professionally disrespected – influence in the council.

Every local gov webbie has a tale, first- or second-hand, about the uselessness of Crapita. I certainly have one; let’s just say that, in my experience, usability isn’t part of their culture or knowledge (and according to reports about Birmingham, ‘external testing’ wasn’t included in the original cost estimate).

When tested, the expensive system Crapita produced for Birmingham had a failure that simply takes your breath away: it did not recognise pound or euro signs, apostrophes or quotation marks. Crapita is a leech on the public sector which, like Rasputin, refuses to die despite failure after failure.
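A failure like that is almost always a character-encoding mismatch rather than a mystery: text stored as UTF-8 but decoded as Latin-1 (or vice versa) turns pound signs, curly apostrophes and quotation marks into junk. A minimal sketch of the classic mojibake (the strings here are illustrative, not taken from the actual system):

```python
# Classic mojibake: UTF-8 bytes mistakenly decoded as Latin-1.
# The characters that reportedly broke are exactly the ones
# outside plain ASCII: pound signs, curly quotes and apostrophes.
text = "£2.8m – ‘bargain’"

utf8_bytes = text.encode("utf-8")        # how the text is actually stored
garbled = utf8_bytes.decode("latin-1")   # how a misconfigured system reads it

print(garbled)  # the pound sign surfaces as 'Â£', the quotes as multi-byte junk
```

The fix is unglamorous: declare one encoding (these days, UTF-8) consistently across the database, the templates and the HTTP headers, and test with real-world punctuation before launch.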

What they have produced is a disaster with failures thus far noted being:

  • It does not meet basic accessibility requirements. Twitterers noted that it launched with one accessibility statement which was changed the next morning to another which was more non-committal (‘meets standard’ to ‘aims to meet standard’).
  • Stacks of incoming links have broken including those at the top of Google natural search results.
  • Masses of internal links are broken.
  • Content remains often simply appalling such as an empty ‘What’s New’ page.
  • Reviewers say online forms are broken.
  • Absolutely no thought given to SEO.
  • Incomprehensible alt text and title attributes for links and images.
  • Lack of validation on many pages.
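The broken-inbound-links item, at least, is a solved problem: a relaunch normally ships with a map of legacy URLs to their new homes, served as permanent (301) redirects so that bookmarks and Google rankings survive. A minimal sketch of the idea, with hypothetical paths standing in for the real site structure:

```python
# Minimal sketch of a legacy-URL redirect map (hypothetical paths):
# the kind of 301 mapping that preserves inbound links through a relaunch.
REDIRECTS = {
    "/council/housing.htm": "/housing",
    "/council/bins.htm": "/bins-and-recycling",
}

def resolve(path: str):
    """Return (status, location) for an incoming request path."""
    if path in REDIRECTS:
        return 301, REDIRECTS[path]  # permanent redirect keeps search rank
    return 200, path                 # serve the page as-is

print(resolve("/council/housing.htm"))  # → (301, '/housing')
```

In practice this lives in the web server configuration rather than application code, but the principle is the same: no inbound link that worked the day before launch should 404 the day after.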

Absent PR or just arrogant PR?

The manager sent out to respond to all this, Glyn Evans, has been rapidly digging himself into a bigger and bigger hole. Refusing to address any of the major faults found, he has instead been reduced to accusing questioners of being members of the ‘Twitterati’ and to the stunt of finding his own expert to claim the website is all shades of stunning.

It is a major irony that the council’s PR department doesn’t actually operate within the website: a while back it launched its own separate site, complete with YouTube channel and Twitter feed. The PR team obviously has the power to shape its own online destiny but not the power to control the council’s messaging; Evans’ statements have been a PR disaster, with the Birmingham Post’s editor furious and undoubtedly vengeful.

The effect on the city

Another element to the PR disaster is the effect on the attempts by the city to grow its digital sector.

The city’s digital businesses have reacted with sharp intakes of breath, expressions of horror and rapid distancing.

Martin Belam presaged this in his comment on the first news of the cost:

There is a rather fantastic irony of it happening in the city outside of London which has perhaps the most vibrant and vocal digital scene in the UK.

Successful local enterprise Clarity Digital said:

The most frustrating part is that on the one hand the City Council, AWM [Advantage West Midlands] and others want to develop the region’s creative offering. On the one hand the City is encouraging digital and talking up the region’s ability to lead the economic recovery through thriving, innovative digital industries. And then they release this nonsense that does nothing to help the credibility of these ambitions.

The Council’s response was that the site is for the residents and not the Twitterati. That comment alone shows the sheer lack of understanding of digital in general and social media in particular. Most of those commenting were residents, business owners or people working in the City. Twitter enabled us to discuss this, for people to air their views. Prior to the Internet, a story like this would have resulted in a pile of letters to the local newspaper. Now people can discuss and debate via a range of sites, blogs and of course Twitter.

In a brilliant post which attracted a stack of comments, Jon Hickman, from the research centre of the Birmingham School of Media at Birmingham City University, looked at the problems of managing such a large project, and the possible solutions.

The mass of concerned contributors points to the wealth of local – and national – professional goodwill which a local authority could draw on to improve its website, if only it would get its head out of the sand (or its arse) and stop digging itself deeper into a hole of its own making.

Said Jon:

So if the root of the problem is the procurement process, what would I suggest instead? I’m going to be completely idealistic, and I do realise that there are quite huge barriers to this sort of change, but I have an answer. Here’s a manifesto for better procurement of web projects (it would work for IT projects and service design too):

1. Let’s focus on needs and the original problem, not best-guess briefs by well-meaning non-experts.
2. Let’s be aware from the outset that your best guess cost is going to be wrong, and the project will be late.
3. Let’s allow for changes in a radical way: not just contingency days in the spec, or a change management process.
4. Let’s budget for innovation.
5. Let’s open up the project and let users shape the end result.

Amongst the fascinating comments Jake Grimley (Managing Director of Made Media Ltd. in Birmingham) said:

In my experience, the developers and project managers tend to care deeply about the problems they’re solving and feel that they are fighting a daily battle against organisational politics, third-party software vendors, and basic client-misunderstandings to get those problems solved. For that reason all websites are usually a compromise between the ideal, and the practicalities that have to be accepted in order to deliver. And that last word is key. Steve Jobs said that ‘real artists ship’. Solving the problems in new and innovative ways is nice, but delivering *something that works* is more important. When we work with larger public sector organisations at Made, we sometimes find that delivery is not really at the core of the organisation’s ethos. For that reason it has to be us chasing the client, driving delivery rather than the other way around. That commitment to delivery takes sustained effort and joint experience, which is what would make me sceptical about your ‘hire a bunch of freelancers and give them six years’ approach.

Other than that, what you’re describing is similar to agile software development methodologies. These always appeal to programmers because they save one from having to think it all through before you start coding. However this methodology is completely at odds with the classic public sector tendering system, which attempts to specify the project in its entirety before even speaking to prospective developers. But then, if you had £600K to spend, wouldn’t you want to know what you were going to get for it before you signed the contract? In addition, agile methodologies do not work well in large political organisations, because they rely on a client sponsor who’s not afraid to take quick decisions when presented with alternatives. Does that sound like Birmingham City Council to you?

[Solution?] This one’s quite simple I think:

Commission a specialist website development company with a track record of delivering complex websites to budget and to timescale, rather than – say – a generic IT outsourcing company.

The irony here is that the reality of the web team putting together the new BCC site is possibly uncannily similar to the ‘ideal’ process Jon outlined.

Another commentator was Brummie developer Stef Lewandowski:

From experience of public sector tendering if you don’t match the brief fully in the pre-qualification stage then you have no chance of getting shortlisted. So the larger the project, the smaller the chance that the commissioned web company will have to influence the brief they are responding to?

These comments show both the willingness of professionals to help and the frustration with, and withdrawal from, government which the actions and comments of so many ‘let’s-pat-ourselves-on-the-back’ bureaucrats like Glyn Evans provoke.


What is the number one lesson I’d like readers in local government to take from this abject disaster? Rebellion.

Local government webbies did not cause this (as Stuart Harrison points out) and would not have made the series of bad decisions which led to the ship hitting the iceberg. Why? Because they’re professional webbies and know what the heck they are doing.

This project was not run by them, it was outsourced by non-webbies to a company which is not about the web and doesn’t live to build brilliant websites and which operates in an uncompetitive environment. Put simply, organisations like Crapita can get away with this and move on to the next excess and profits binge.

This disaster is the best argument for the new Public Sector Web Professionals organisation. Webbies need to develop the clout to call out those responsible for this and for the other council website disasters coming down the pipe.

Another point here is that local government website disasters know no party: every party is responsible, somewhere in the country, for crap websites. In Birmingham it was Tories and LibDems; elsewhere it is Labour.

In Brum, deputy council leader Paul Tilsley, who has overall responsibility for ‘business transformation’, admitted that he was wary of switching on the revamped website.

He said he feared journalists would be logging on at 9am on the first morning in an attempt to identify any “glitches”. Little did he know it would be webbies (aka those Evans dismissively and patronisingly labelled the ‘Twitterati’) demolishing his council’s website. One can only hope some Brummies make this political and hold Tilsley’s feet to the proverbial fire.

What unites those wanting change is not party but profession. Those webbies in Conservative HQ and Tories elsewhere need to link up with the rest of us to raise our collective status.

We all need to reach out professionally to those who have commented, who have shown anger and interest in this disaster, and raise our collective voices to shout ‘NO MORE!’ We need to get together and elbow the Glyn Evanses of this world out of the way, because we are citizens as well as webbies, and as such we know we can do better than this crap served up to us as ‘transformation’.

This episode should be used as an object lesson in how-not-to-do-it, and it should mark a turning point for web professionals in local government.

Make it so. Yes we can. You have the power.

