New blog

All new content on my restarted blog is here
Showing posts with label usability. Show all posts

Wednesday, October 7

Text still rules

This is a really excellent reminder of a web basic, one unfortunately often forgotten as websites add feature after feature and in the process become bloated.

“Think of your Web audience as lazy, selfish and ruthless,” said Michael Gold, West Gold Editorial principal quoting usability guru Jakob Nielsen’s apt description of today’s impatient, task-oriented Web audience during his remarks at a recent ONA panel. “Web audiences are on a mission—they’re task-oriented.”


Text matters on the Web from Martin Ricard on Vimeo.

Related posts of mine:



HT: ONA

Thursday, September 24

Bad user testing beats no user testing



Jakob Nielsen has noted that it's now twenty years since he started what he calls the 'discount usability movement'.

This might be egging it a wee bit; I'm not sure there is such a 'movement' apart from the one Nielsen promotes.

It's true that major companies use discount usability tactics - I noted before how last.fm used it when their site went through major changes. But 'movement'?

Moving on ...

Nielsen presented a paper entitled "Usability Engineering at a Discount" at the 3rd International Conference on Human-Computer Interaction in 1989.

It was born out of necessity, he says, as he simply didn't have the budget of the IBM User Interface Institute where he'd previously worked.

The paper advocated three main components of discount usability:

  • Simplified user testing, which includes a handful of participants, a focus on qualitative studies, and use of the thinking-aloud method. Although thinking aloud had been around for years before I turned it into a discount method, the idea that testing 5 users was "good enough" went against human factors orthodoxy at the time.
  • Narrowed-down prototypes — usually paper prototypes — that support a single path through the user interface. It's much faster to design paper prototypes than something that embodies the full user experience. You can thus test very early and iterate through many rounds of design.
  • Heuristic evaluation in which you evaluate user interface designs by inspecting them relative to established usability guidelines.
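The "5 users is enough" claim rests on a problem-discovery model Nielsen published with Tom Landauer: the share of usability problems found by n testers is 1 - (1 - L)^n, where L is the chance a single user hits any given problem (about 0.31 in their data). A minimal sketch of that curve (the 0.31 figure is from their research, not from this post):

```python
# Nielsen & Landauer's problem-discovery model: expected share of
# usability problems found after testing n users, given a per-user
# discovery rate L (their published average was roughly 0.31).

def share_of_problems_found(n_users, discovery_rate=0.31):
    """Expected fraction of usability problems uncovered by n_users testers."""
    return 1 - (1 - discovery_rate) ** n_users

for n in (1, 3, 5, 15):
    print(f"{n:2d} users -> {share_of_problems_found(n):.0%} of problems found")
```

Five users already find the large majority of problems, and the curve flattens fast after that, which is why several cheap rounds of testing beat one expensive one.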
Nielsen says he was stoned in the market square as a heretic and I can well believe it.

I had a similar experience when, discussing issues with LocalDirectgov's usability offering, I proposed that council web teams should use discount testing methods. This provoked nigh on outrage and a swipe at Nielsen by the usability company Nomensa. I like to think I moved them on from their initial horror to grudging agreement but you can make your own mind up in the debate, as it spilled over several posts and onto a Nomensa worker's blog.
Nielsen even has the nerve, to some people's delicate sensibilities, to say:
Discount usability often gives better results than deluxe usability because its methods drive an emphasis on early and rapid iteration with frequent usability input.
As well as, the horror:
Discount usability methods are robust enough to offer decent results even when you don't use perfect research methodology.

In other words: Bad user testing beats no user testing, every time.

He cites a 1989 exercise in which several teams each ran a usability study of MacPaint 1.7 (an early drawing program), testing three users apiece.
Better usability methodology does lead to better results, at least on average. But the very best performance was recorded for a team that only scored 56% on compliance with best-practice usability methodology. And even teams with a 20–30% methodology (i.e., people who ran lousy studies) still found 1/4 of the product's serious usability problems.
Nielsen claims that "my 20 years of campaigning for discount usability have certainly not been in vain, [but] I can't yet declare a win" — and nowhere is this more evident than in government, where cheap-but-effective methods of finding website errors would, you would think, have most resonance.

Neither the US (usability.gov) nor the UK (usability.coi.gov.uk) official government usability advice contains any reference to discount methods.

~~~~~~~

Here's a presentation on discount user testing, Cheap'n'easy usability, which I first gave in 2006.



Monday, September 21

Council homepages: what's wrong with 'interesting development'?




There's been a great flurry of interest in local government webbie circles because a few councils have gone down the Google route of deliberately reducing homepage content and pushing search as the way to find what you're looking for, what you want to do, what your task is.

Lancashire's Kevin Rainsbury told the lively thread on the Communities of Practice (CoP) website that:

We're at a relatively early stage in development but felt it was worthwhile launching in its current state as it was able to provide customers with something better than what they had previously. We did a fair bit of customer research which led us down this path. We're also aware that the new site might "ruffle a few feathers" given it's such an unorthodox approach for a local authority. The Socitm Better Connected review will be of particular interest this year!
(I asked, via Socitm, for more information on any research or pre-launch testing by Lancashire and Westminster, but as of the time of writing this hasn't appeared.)

Webbies divided

On the rest of the thread and in blog post comments, council webbies were divided on the sites. Some cheered the innovative approach - 'It is good to know Lancashire is one of the councils thinking out of the box' - whilst others found fault (eg failed search results) or questioned the usability.

Feeding back from experience, one webbie said:
When I’ve carried out user testing I’ve often found that participants are fairly evenly split between ‘expert’ users who like to search for information using a search engine and ‘novice’ users, who are less confident and like to browse and click on links. Some people just happen to feel more comfortable when they have some hints about what to click on. Even better if the links they can click on are relevant to their goals. In comparison to Lancashire’s site, the Westminster site does place popular tasks under the search box.
This experience reflects not just use of council websites but longstanding experience of websites in general: it's also common sense that users would be split between novices and experts (with a mass in between). However, one comment made clear that this understanding hasn't got through to all council webbies:
Search is something that has to be pushed at people more. I get tired of reading complaints from the public saying “I tried to find X on your site, but I HAD to use the search” – like the search on a website is some sort of last resort form of torture.


Forcing rather than following users

Another reviewer noted that Westminster's design included navigation options 'below the fold' (meaning that users have to scroll down). This, and other comments, stood out for me as part of an unfortunately common mentality in the public sector: that a design is fine simply because the 'option' is provided - somewhere on the page or via a link. But a lot of users simply won't scroll, or won't see something that is 'obvious' to you.

For example, Jakob Nielsen’s study on how much users scroll (in Prioritizing Web Usability) revealed that only 23% of visitors scroll on their first visit to a website. This means that 77% of visitors won’t scroll.

So Westminster and Lancashire are actually doing what the commentator above wants them to do - effectively 'pushing search at people more'.

Hearing from an expert

I spoke with web design authority Gerry McGovern about Lancashire and Westminster.

There are 'rules' which have evolved from best practice, built up by experts observing over many years how users actually behave. There are 'heuristics', which are established principles for user interface (web page) design.

I said:
My understanding of feedback from user studies is that 'search is the user's lifeline when navigation fails', and also from what you have written and said that navigation should be improved. Therefore it is a mistake to hide it on a homepage.

[You] need to look at who the homepage-specific audience is, and at the fact that most users actually arrive elsewhere, so a homepage focus can be a distraction from addressing users' main needs, such as being able to complete tasks from wherever they arrive and ensuring that search and referral traffic is sending them to their goals. As well, that reliance on search means a lot of work on tweaking and refining results and results presentation.
Gerry replied:
I think it’s an interesting experiment, particularly what Lancashire is doing, but I basically agree with your points. Search and navigation should go together. Often what happens is people search to get roughly in the right direction and then navigate the rest of the way. But many will, as you point out, navigate once they’re presented with good logical links.

I think Westminster has got more of a balance; they have brought the top tasks onto the homepage as well as the big search area.

The fact that the search will have to be tweaked is a good thing. Really managing search is very important so to have a big focus on quality search is great.

You’re also right about the decline in homepage importance. Because a great many are starting at Google it means that they will often end up on a deeper page.



Why does this keep happening?

The apparent absence of user testing within an iterative design process - what I would describe as standard industry best practice - happens in council website design as a result of, as Carl Haggerty explained in his blog post about the Lancashire and Westminster developments, "a wide range of influencing factors that will impact on the local webteam to make particular choices."

He identified those factors as:
  • political pressure
  • resources
  • role of communications in website
  • role of ICT in website
  • role of customer services in website
  • location of webteam in organisation
  • external influences such as Socitm Better Connected, Gerry McGovern, plus many, many others
  • which conferences members of the webteam have attended (web, social media etc)
  • and yes, last but not least, our customers' needs – all the above shouldn’t matter but they do.
I don't think it's chance that Carl happens to put customers as the last bullet point, but I think council webbies are scoring an own goal in their desire for website improvement if they don't prioritise customer feedback, especially through user testing.

For example - and this is one I have cited before - when I conducted guerrilla testing for a new design, it became immediately obvious that the main link through to online services was simply not being seen by users. All of them were missing it; it might as well not have been there. This was because of a common, known usability issue, but one on which I'd found myself over-ruled as I wasn't 'the decider'.

Because I had gone and done some cheap'n'easy testing and got a unanimous result I was able to get that design redone - because what would the counter-argument be? 'I know better than the users'?

The customers are the biggest weapon in a council webbie's armoury against those factors that Carl cites, but how often are they used?

Is design consistency a bad thing?

Carl asks:
Why are we all taking a separate view, if we all have the same goals in mind, why haven’t we all developed identical looking sites with just a logo or some colour change as the main difference?

Shouldn’t we all agree to a consistent approach, purpose and some principles for local government websites (including the homepage) that we can all sign up to?
I wonder about this too.




Many governments, such as in Canada, Hong Kong and Singapore, have adopted common look'n'feel policies, which dictate design boundaries. Others, such as the American government and various Australian states, have provided for some years now both policy and guidance as aids for government webbies.

The UK doesn't have any of this.

The new COI usability guidance isn't meant for local government sites - and doesn't mention guerrilla testing. Socitm's Better Connected has radically improved from being a lengthy tick-box list to honing its message but still has some way to go.

Exercises such as my friend Dave Briggs' 'crowd-sourced' What makes for a decent Council website? have, in my opinion, serious problems: they introduce bias, have usability issues of their own, and are not necessarily customer-led. I also remain unconvinced by Idea's developing 'Knowledge Hub', for similar reasons.

To answer Carl's question - "why are we all taking a separate view, if we all have the same goals in mind?" - I would look at:
  • how webbies are being and have been led (hello DCLG)
  • what resources they have (why they are so disparate or non-existent, why so many 'best practice' experiments keep getting funded and keep failing) and 
  • why, a decade into very well-funded national egov/transformation policy, council webbies still remain all over the place on the basics of designing successful, customer-driven websites
You have to ask these sorts of hard questions (and others; for instance, I'm not at all sure that "we all have the same goals in mind") to truly answer Carl's.

Simply put: I would question, with I think good cause, a rush for 'innovation' whilst some extremely basic yet, to me, obviously under-recognised problems still exist.

~~~~~~

Addendum: Following a Twitter exchange with local gov web manager Julian Scarlett, a point occurred to me about both common look'n'feel and better information provision and general support for local government web development.

'Interesting development' requires resources and - with a few exceptional exceptions - this is not found in smaller population districts. This is another reason why there is such a huge disparity with the quality of local government websites.

Sunday, September 20

Postscript two: Lessons from the great 2009 Birmingham City Council website disaster



Following the almost universally badly received launch of the new Birmingham City Council website, local developer Mark Steadman posted a challenge on his blog:

Why don’t those who are busy complaining and building independent fixes to problems that only concern people who know or care what hashtags are*, get together and build an alternative Council website? Something that’s a real resource for its users, and doesn’t suffer from the dearth of features the official site does.
In particular he singled out for criticism, and for his challenge, another local developer who had joined the negativity: Stef Lewandowski, who describes himself as 'Creative entrepreneur and maker of social network toolbox for dads Odadeo.com - Webby winner, Clore Fellow, ideas guy, jack of all trades but master of none!'

Well Lewandowski has risen to it and in a matter of days has built what's been labeled #bccdiy (screengrab above).

He describes it as:
An unofficial website, aimed at providing a useful service to people in Birmingham based on the contents of the Birmingham City Council website, combined with other tools and services.
It comes from the input of people to the bccdiy wiki I mentioned in my last postscript (the wiki was set up by @jonbounds).

Thus far, I'm not aware of any reaction to this rather incredible and groundbreaking development (which you can follow on Twitter) by anyone from the council, but if they do anything but welcome it - and it has already been suggested that they may not - then it really is time for local people to start haranguing councillors.

They could start with the Deputy Leader, Tory Paul Tilsley, who virtually came out swinging at a local event full of Birmingham's 'digerati' (or 'twitterati').



Open mouth, insert foot

#recasting was chaired by Charlie Beckett, who introduced Tilsley by praising the council's "magnificent online presence". This drew a response from Tilsley asking Beckett, "could you repeat that for the benefit of Marc Reeves" (the editor of the Birmingham Post, who have printed several articles critical of the website). Tilsley then laughed - but no-one joined in.

(Alison Smith commented on her blog post about the event "we were all too polite to heckle and I wish I had.")

Tilsley said that the council - and bear in mind the venue for his comments - was engaged in 'innovative activity' and that the website was part of their business transformation (actually, all councils are doing this, as Whitehall has instructed them to): "embracing the whole of the digital agenda over a ten year program. If you take a snapshot you will have negative comments but you need to see the whole picture."

“We’ve come in for a degree of criticism because we did spend a bit of money on it,” he said, my emphasis. “It was completely revamped and you can’t create 87,000 pages without cost.

“That was the size of the agenda that we were tackling to get a product that was responsive.

87,000?

The number that's been quoted by Glyn Evans, Birmingham City Council's Corporate Director of Business Change, thus far has been 17,000, which a lot of people have questioned (it's suggested this includes every council minute or other documents).

But it gets worse. Lewandowski says:
@stef of the 87,000 pages that were quoted I've got a functional site out of the 685 uniques I can actually find to index. #bccwebsite
Tilsley also seems to be either unaware or actively misled: council staff, rather than the contractor, transferred the content - thus adding to the £2.8m the council has been forced (via FOI) to state as the website's cost.

The Deputy Leader has invited anyone with questions about the website to email him at paul.tilsley@birmingham.gov.uk. Perhaps a first question could be where he got the 87,000 figure from?

Another local councillor, Robert Wright, a LibDem and so part of the coalition which runs the council, has addressed - in a manner of speaking - the controversy on his blog. So there's another avenue for giving feedback to local politicians.



What may help speed things up is the arrival (albeit very late) of the TaxPayers' Alliance, who are infamous for getting themselves quoted in the media and provoking politicians into reactions.
Hang on, a £2.8m website that has taken literally years to construct isn’t world class? How much does a world class one cost then? Billions?! And if we’re to believe all the boastful publicity we’ve paid to have papered around this city, and the protestations of Cllr Mike Whitby, isn’t Birmingham a world class city? "A Global City with a Local Heart"?

Well if it is, according to this feedback, it’s a global city with a pretty crummy website…
In other feedback ...

Paul Robert Lloyd, Visual Designer at Clearleft, has pointed out that the council next door, Walsall, is doing a far, far better job (others have cited nearby Lichfield).

Ross Riley, technical director of Birmingham digital agency One Black Bear, comments on "another catastrophic disaster in a long line of public sector web projects".
The running theme throughout the site is the complete lack of even a hint of quality. There's the amateur feel of the graphics in the header, the massively bloated size of the pages, the search facility being left open to Cross Site Scripting (XSS) attacks, the painfully slow load times and the lack of any design input or consistency throughout the entire site.

There's only two possible reasons as to why this project has ended up as such an expensive disaster. Either the team running it had no expertise in online projects and failed to see that they were being overcharged for sub-standard work or someone on the project team is plotting an escape to Panama with a couple of million in notes.
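The XSS point is worth unpacking: a search page that echoes the user's query back into the HTML without escaping lets an attacker inject script via a crafted link. A minimal, purely illustrative sketch of the fix in Python (nothing to do with the council's actual Java/FatWire stack):

```python
import html

def render_search_header(query):
    """Echo the user's query back into the page with HTML metacharacters
    escaped, so injected markup is displayed as text rather than executed."""
    return f"<p>Results for: {html.escape(query)}</p>"

# A crafted query that would run script if echoed back unescaped:
malicious = '<script>alert("xss")</script>'
print(render_search_header(malicious))
# The <, > and " come out as &lt;, &gt; and &quot;, defusing the payload.
```

Escaping on output like this is the standard defence; leaving a public search box without it is the kind of basic omission the reviewers were pointing at.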
On Reddit programmers find a whole stack of issues with the website.

Someone has noticed that Birmingham's Respect councillor is the only one who does not have a webpage.


Past posts:

Saturday, September 19

Google UK offers free webinar to council webbies on 'conversions'



As part of Google UK's increasing local engagement it is running the first in a series of webinars next week.

This one is about 'conversions': getting users through a process on your website and successfully out the other end. The process may start with them clicking on an ad, being directed from the homepage, or entering via an organic search result.

All websites lose people as they drive them towards a goal - which can be paying for a book or ordering a new rubbish bin.

When I presented at the Google local government event I mentioned that even top websites like Amazon - for whom this means lost revenue, lost profits - lose significant numbers of people along the way despite testing and testing and testing again. From memory this was anything up to 20% but, as I was with folks who know way, way more about this stuff than me, I double-checked. 'Yes', the Googlers nodded ...

For local government I suspect that a lot more than 20% of transactions are lost and this means a number of things:

  • 'Service transformation' is in large part about encouraging self-service, which is in large part about getting people to do stuff online - if the process doesn't work for many or even most of them, 'transformation' simply isn't going to happen.
  • If people have a bad experience with a process it will be that much harder to improve it and get them back. Plus they will tell people about their failure.
  • If the numbers of people successfully completing online transactions are below expectations then this undermines those promoting and developing them - including budget allocation.
  • Conversely, if expectations are too low - as I believe they usually are - bad processes become tolerated and it is that much harder to argue for money to be spent on improving them, for example on user testing.
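To see why those losses add up, it helps to look at how per-step drop-off compounds across a multi-step transaction. A rough sketch with hypothetical numbers (the 85% survival rate per step is an illustration, not a figure from Google or any council):

```python
# Illustrative conversion funnel: even modest losses at each step of an
# online transaction compound into a large overall loss, which is why
# testing and fixing individual steps matters so much.

def funnel(visitors, completion_rates):
    """Visitors remaining after each step, given per-step completion rates."""
    remaining = [visitors]
    for rate in completion_rates:
        remaining.append(remaining[-1] * rate)
    return remaining

# Hypothetical five-step process where 85% of users survive each step.
steps = funnel(1000, [0.85] * 5)
for i, n in enumerate(steps):
    print(f"after step {i}: {n:.0f} visitors remain")
print(f"overall conversion: {steps[-1] / steps[0]:.1%}")
```

Losing 15% at each of five steps leaves well under half the visitors completing the transaction, so small per-step improvements pay off disproportionately overall.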
The webinar marketing is aimed at commercial organisations; it talks about "converting into paying customers". But for users, an online process is a process, so for local government every single point made will be applicable - and this is why Google's UK local government people want council webbies to watch.

Here's what it will cover:
A. Understanding Your Visitors
B. Maximising Your Traffic
C. Traffic Sources
D. The Homepage
E. Landing Pages
F. Bouncing Visitors
G. Visitor Journey Steps
H. Exits
I. Site Search
J. Go
It will be conducted by one of Google UK's certified Conversion Experts and the first one takes place 24 September at 3pm.

Saturday, September 12

Postscript: Lessons from the great 2009 Birmingham City Council website disaster



The fallout from the relaunch of Birmingham City council's website (#bccwebsite) has continued, not just online but in the local press as well thanks to the strong interest of Birmingham Post Editor Marc Reeves.

It's not a coincidence that the Post has a 'web 2.0' site and in its reporting about #bccwebsite has even included comments left on its news stories as well as comment sourced from online feedback - including mine.

But by far the most interesting development is the first step in the concretisation of webbie attempts to influence the council's web development, in the shape of a wiki (capture above).

This echoes some of the major developments bringing together webbie citizens and government around the world, such as the just launched Code For America who say "we believe there is a wealth of talent in the web industry eager to contribute to the rebuilding of America."

It seems there's a wealth of talent wanting to contribute in Birmingham as well. Whether they will be heard depends as much on their abilities at lobbying as at creating useful stuff online - alongside the council's abilities at listening, of course. Let's not start off being cynical!

Socitm's Helen Williams noted on the Communities of Practice public sector social network my main point, that - from this disaster (and acknowledging it as a disaster) - Things Must Change:

I think there are some specific reasons for the level of interest (reaction to the high cost, perceptions of the third party contractor, the size of the council / city and the fact that it has many digitally active citizens). However, I also believe we have entered an era where councils can expect their websites and any re-launches to receive a whole new level of public scrutiny and comment (and not just in the Web Improvement and Usage Community!)


Let down by Comms, and politicians?


As I noted before, part of the disaster has been a PR one.

In private conversation I have expressed my sorrow for the staff member forced to step up as the fall guy. The responses of Glyn Evans, Birmingham City Council's Corporate Director of Business Change, have been unfortunate, but they betray a lack of help from the Council's PR department.

Even more unfortunately, why is he the fall guy and not a local politician? Doesn't it say something about the political leadership's attitude to their website - and possibly explain why they got into this mess in the first place - that they can watch it be trashed in the press and online and say nothing?

Birmingham Post Editor Marc Reeves agrees with this analysis in a comment on my blog post:
I don’t know Glyn Evans very well, but I do know he’s an effective and passionate public servant who cares deeply about doing the best for the city of Birmingham and its citizens. Much of the opprobrium has centred on him, which I think is a little unfair, although no-one should shirk from holding him up to the ‘cabinet responsibility’ principle.

However, he has been let down by the absence of a cohesive, proactive strategic comms process which – if it existed – would have spared the council some devastating reputational damage, and Glyn this undeserved personal and professional embarrassment.

A well thought-out public affairs / public relations approach to this simply wouldn’t have let the website ‘launch’ proceed. The simple expedient of quietly announcing that the first ‘below the line’ phase of the web overhaul was complete, with functionality to follow, would have avoided this mess. If, as Glyn says, there are major improvements on the way, then simply wait for them to be up and running, then unveil all in a hail of publicity.

The website scandal just illustrates a much larger problem at the heart of BCC, I fear.
The non-reaction from local politicians (bar Sion Simon MP's retweet of my post) shows why the efforts of the Birmingham digerati (aka 'twitterati') need to be as much political as they are digital.



Amongst other developments
  • It has been suggested that the task of transferring content fell to council staff rather than the contractor. Also that the statement '17,000 pages' actually means 17,000 content items.

  • An old post by Charles Arthur in The Guardian has surfaced which contains the claim that the project was "essentially trying to knit 35 sites operating under the council's umbrella into a single one. "

  • Reviewers have noted that the forms system, at the heart of any possible cost savings and 'service transformation', is extremely outdated, with "a bewildering array of options", and has bits which simply don't work.

  • Reviewers have also noted that there are payment forms with no encryption.

  • Questions have been asked about whether there was any public consultation as well as pre-launch testing/breaking and fixing (including usability testing), partly as comments by Evans have suggested this is happening post-launch - "there is little point in assessing our residents' perspective – the view we value most – at this stage".

  • The exact role of the council's web team in the exercise remains a mystery.
  • The CMS wasn't built by Capita; it is a commercial Java-based CMS called FatWire (source: Stef Lewandowski).



Going over the top (in another sense)

In a great post about the website, local web business owner Jake Grimley has bravely nailed his colours to the mast (more people with potential business to lose will have to follow Jake if efforts like that started by the Wiki are to succeed) and made concrete suggestions from his own experience of developing large and mission-critical websites. He also goes with my guess on what £2.8m actually bought.

He says:
For me, it’s not really the lack of RSS Feeds (inexplicable as that is) or the failure of the CSS to validate, or the difficulties keeping the site up on its launch day that really bother me. It’s the complete lack of attention to detail or quality in the content, design and information architecture that I find astounding. For those that need examples, there is a log of snarky highlights, but you just need to spend five minutes clicking around the site to see what I mean. It’s the equivalent of re-launching the Town Hall with bits of plaster falling off, missing roof-tiles, and sign-posts to facilities that never got built.
Another great follow-up post by Pesky People details the accessibility issues and comes out fighting:
At the moment they are saying disabled residents in Birmingham are not important enough, as Glyn Evans was quoted in the Birmingham Post.
All the signs are that this one will not go away, for the reasons Helen Williams outlines. But it remains to be seen if the 'Lessons from the great 2009 Birmingham City Council website disaster' will actually be learned any time soon. That's down to all of us, including you, the reader of this blog post.

~~~~~~

One other consequence

As I endeavoured to make clear in my previous post, none of this should be taken as a criticism of the City Council's Web Team. It is unfortunate that it appears it may have been taken that way. It would be more than good - invaluable, in fact - to hear from them as developments continue.




Wednesday, September 9

Lessons from the great 2009 Birmingham City Council website disaster



The night before last - and in the night - Birmingham City Council without much fanfare switched over to its rejigged website.

Within moments the twittersphere was alight. It was crashing, it had obvious faults and it looked terrible. Over the next 36 hours reviewer after reviewer found fault after fault.

This would not be news - how often do bad government websites get launched? - if it hadn't been for the efforts of Paul Bradshaw (a Birmingham lecturer in online journalism) and his project Help Me Investigate.

Following up on local Brummie whispers, they put in an FOI to the council and discovered that the cost of this website revamp was, wait for it, £2.8m.

What they bought

Since the launch the lead for the council, Glyn Evans, director of business transformation, has written two blog posts and done a couple of interviews.

From this - not from information provided as a result of the FOI, because of 'commercial confidentiality' - it is possible to glean that the £2.8m essentially covers building a content management system (not from scratch, as it turns out, but a commercial Java-based CMS called FatWire), transferring the content and building a workflow system (and there's a mention of training staff). Oh, and 'find my local', which much smaller councils have had for ages.

From this new CMS they will be able to offer better (i.e. now standard) online services integrated into it and basics like RSS feeds - but nothing suggests that the cost of those developments is included and any webbie worth their salt will ask why any CMS should cost that much, even with bells and whistles and silver-service catered training sessions attached. Unfortunately we're unlikely to ever know real details, because of that ever so useful cover-up tool of 'commercial confidentiality'.

On the reality of what it has actually bought, the council admits as much in its response to the FOI:

Yes, the new council website is costing more than was originally estimated but actually it’s not an overspend. Originally, the website replacement project was just that – a replacement for the (obsolete) technology we are using. And then we rethought; the new website will be at the heart of delivering improved services to everyone who lives in, works in or visits Birmingham. This is a totally different proposition, and a totally different approach – and budget – was required.

Also, yes, we did consider Open Source but no, it wasn’t suitable, otherwise we’d be using it.

And again, yes, the website is being delivered later than was originally planned but it’s not the same website as was originally planned.
None of this sounds like anything other than a CMS upgrade. And someone should ask for the papers on just how they decided that Open Source wasn't a solution.



Expensive leeches

Why was it £2.8m? It is hard to see how it could be that much - other than that the job was outsourced to the infamous public sector contractor Capita.

Says Stuart Harrison (aka pezholio):
By all accounts the web team have had very little involvement and most of the grunt work has been done by the council’s outsourced IT ‘partner’ Service Birmingham – operated by everyone’s favourite outsourcer Capita (or Crapita if you read Private Eye).
Stuart's comment about the rumours that the council's own webbies were excluded from the website development process is underlined by a comment on Josh Hart's blog about the irony of the City Council Website Manager giving a presentation, in a personal capacity, about the semantic web and triple-tagging at a local geeky meeting - a subject about as related to his own website as space travel is to an Indian villager.

I can relate. In my past capacity in local government there was a vast distance between my role in the 'egov community' (respected, invited to speak, present and comment) and my actual - professionally disrespected - influence in the council.

Every local gov webbie will have a tale - or a tale told to them - about the uselessness of Crapita. I certainly have one; let's just say that, in my experience, usability isn't part of their culture or knowledge (and according to reports about Birmingham, 'external testing' wasn't included in the original cost estimate).

When tested, the expensive system Crapita produced for Birmingham had a failure which just takes your breath away: it did not recognise pound or euro signs, apostrophes or quotation marks. They are a leech on the public sector who, like Rasputin, refuse to die despite failure after failure.
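For the record, this kind of failure is trivially cheap to test for before launch. Here's a tiny sketch, using a plain encode/decode round trip as a hypothetical stand-in for a CMS's storage pipeline (not Crapita's actual system):

```python
# Smoke test for exactly the character-handling failure described above:
# push text containing pound and euro signs, curly apostrophes and
# quotation marks through a store-and-retrieve cycle and check it survives.
SAMPLE = "£2.8m: that’s “quite” a lot of €uros, isn’t it?"

def survives_round_trip(text, encoding="utf-8"):
    """True if the text comes back intact after encoding and decoding."""
    try:
        return text.encode(encoding).decode(encoding) == text
    except UnicodeEncodeError:
        return False

ok_utf8 = survives_round_trip(SAMPLE)               # UTF-8 copes fine
ok_latin1 = survives_round_trip(SAMPLE, "latin-1")  # € and curly quotes are lost
```

One sample string and two lines of assertions would have caught the fault in seconds.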

What they have produced is a disaster with failures thus far noted being:
  • It does not meet basic accessibility requirements. Twitterers noted that it launched with one accessibility statement which was changed the next morning to another which was more non-committal ('meets standard' to 'aims to meet standard').
  • Stacks of incoming links have broken including those at the top of Google natural search results.
  • Masses of internal links are broken.
  • Content remains often simply appalling such as an empty 'What’s New' page.
  • Reviewers say online forms are broken.
  • Absolutely no thought given to SEO.
  • Incomprehensible alt tags and title attributes for links and images.
  • Lack of validation on many pages.
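Failures like the broken incoming links are also cheap to catch before launch. A quick sketch (with made-up URLs) of the kind of redirect audit any web team could run against its most-linked old pages:

```python
# Minimal pre-launch link audit: given the old site's most-linked URLs
# (e.g. from server logs or the top Google results) and the new site's
# redirect map, report which incoming links would break on launch day.
def audit_redirects(old_urls, redirect_map):
    """Return the old URLs that have no redirect on the new site."""
    return [url for url in old_urls if url not in redirect_map]

# Hypothetical example data, not real council URLs.
old_urls = [
    "/council/housing/apply",
    "/council/bins/collection",
    "/news/whats-new",
]
redirect_map = {
    "/council/housing/apply": "/services/housing/apply",
}

broken = audit_redirects(old_urls, redirect_map)  # the links that would 404
```

Run against the pages topping Google's natural search results, this would have flagged the mass of broken incoming links before a single visitor hit a 404.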



Absent PR or just arrogant PR?

The manager sent out to respond to all this, Glyn Evans, has been rapidly digging himself into a bigger and bigger hole. Completely refusing to address any of the major faults found, he has instead been reduced to accusing questioners of being members of the 'Twitterati' and pulling the stunt of finding his own expert to claim the website is all shades of stunning.

It is a major irony that the council's PR department isn't actually within the website; a while back they launched their own separate site complete with YouTube channel and Twitter feed. They obviously have the power to shape their own online destiny but not the power to control the council's messaging. Evans' statements have been a PR disaster, with the Birmingham Post's editor furious and undoubtedly vengeful.

The effect on the city

Another element to the PR disaster is the effect on the attempts by the city to grow its digital sector.

The local digital sector has reacted with sharp intakes of breath, expressions of horror and rapid distancing.

Martin Belam presaged this in his comment on the first news of the cost:
There is a rather fantastic irony of it happening in the city outside of London which has perhaps the most vibrant and vocal digital scene in the UK.
Successful local enterprise Clarity Digital said:
The most frustrating part is that on the one hand the City Council, AWM [Advantage West Midlands] and others want to develop the regions’ creative offering. On the one hand the City is encouraging digital and talking up the region’s ability to lead the economic recovery through a thriving, innovative digital industries. And then they release this nonsense that does nothing to help the credibility of these ambitions.
The Council’s response was that the site is for the residents and not the Twitterati. That comment alone shows the sheer lack of understanding of digital in general and social media in particular. Most of those commenting were residents, business owners or people working in the City. Twitter enabled us to discuss this, for people to air their views. Prior to the Internet, a story like this would have resulted in a pile of letters to the local newspaper. Now people can discuss and debate via a range of sites, blogs and of course Twitter.
In a brilliant post which attracted a stack of comments Jon Hickman, from the research centre of the Birmingham School of Media at Birmingham City University, looked at the problems of managing such a large project, and the possible solutions.

The mass of concerned contributors points to the wealth of local - and national - professional goodwill which a local authority could draw on to improve its website if only it would get its head out of the sand (or its arse) and stop digging itself deeper into a hole of its own making.

Said Jon:
So if the root of the problem is the procurement process, what would I suggest instead? I’m going to be completely idealistic, and I do realise that there are quite huge barriers to this sort of change, but I have an answer. Here’s a manifesto for better procurement of web projects (it would work for IT projects and service design too):

1. Let’s focus on needs and the original problem, not best guess briefs by well meaning non-experts.
2. Let’s be aware from the outset that your best guess cost is going to be wrong, and the project will be late.
3. Let’s allow for changes in a radical way: not just contingency days in the spec, or a change management process.
4. Let’s budget for innovation.
5. Let’s open up the project and let users shape the end result.
Amongst the fascinating comments Jake Grimley (Managing Director of Made Media Ltd. in Birmingham) said:
In my experience, the developers and project managers tend to care deeply about the problems they’re solving and feel that they are fighting a daily battle against organisational politics, third-party software vendors, and basic client-misunderstandings to get those problems solved. For that reason all websites are usually a compromise between the ideal, and the practicalities that have to be accepted in order to deliver.

And that last word is key. Steve Jobs said that ‘real artists ship’. Solving the problems in new and innovative ways is nice, but delivering *something that works* is more important. When we work with larger public sector organisations at Made, we sometimes find that delivery is not really at the core of the organisation’s ethos. For that reason it has to be us chasing the client, driving delivery rather than the other way around. That commitment to delivery takes sustained effort and joint experience, which is what would make me sceptical about your ‘hire a bunch of freelancers and give them six years’ approach.

Other than that, what you’re describing is similar to agile software development methodologies. These always appeal to programmers because they save one from having to think it all through before you start coding. However this methodology is completely at odds with the classic public sector tendering system, which attempts to specify the project in its entirety before even speaking to prospective developers. But then, if you had £600K to spend, wouldn’t you want to know what you were going to get for it before you signed the contract? In addition, agile methodologies do not work well in large political organisations, because they rely on a client sponsor who’s not afraid to take quick decisions when presented with alternatives. Does that sound like Birmingham City Council to you?

[Solution?] This one’s quite simple I think:

Commission a specialist website development company with a track record of delivering complex websites to budget and to timescale, rather than – say – a generic IT outsourcing company.

The irony here is that the reality of the web team putting together the new BCC site is possibly uncannily similar to the ‘ideal’ process Jon outlined.
Another commentator was Brummie developer Stef Lewandowski:
From experience of public sector tendering if you don’t match the brief fully in the pre-qualification stage then you have no chance of getting shortlisted. So the larger the project, the smaller the chance that the commissioned web company will have to influence the brief they are responding to?
These comments show both the willingness of professionals to help but also the frustration and drawing away from government which the actions and comments of so many 'lets-pat-ourselves-on-the-back' bureaucrats like Glyn Evans provoke.



Lessons

What is the number one lesson I'd like readers in local government to take from this abject disaster? Rebellion.

Local government webbies did not cause this (as Stuart Harrison points out) and would not have made the series of bad decisions which led to the ship hitting the iceberg. Why? Because they're professional webbies and know what the heck they are doing.

This project was not run by them, it was outsourced by non-webbies to a company which is not about the web and doesn't live to build brilliant websites and which operates in an uncompetitive environment. Put simply, organisations like Crapita can get away with this and move on to the next excess and profits binge.

This disaster is the best argument for the new Public Sector Web Professionals organisation. Webbies need to develop the clout to call out those responsible for this and for the other council website disasters coming down the pipe.

Another point here is that local government website disasters know no party - all of them are responsible somewhere in the country for crap websites. In Birmingham it was Tories and LibDems, elsewhere it is Labour.
In Brum Deputy council leader Paul Tilsley, who has overall responsibility for 'business transformation', admitted that he was wary of switching on the revamped website.
He said he feared journalists would be logging on at 9am on the first morning in an attempt to identify any “glitches”. Little did he know it would be webbies (the people Evans dismissively and patronisingly labelled the 'Twitterati') demolishing his council's website. One can only hope some Brummies make this political and hold Tilsley's feet to the proverbial fire.

What unites those wanting change is not party but profession. Those webbies in Conservative HQ and Tories elsewhere need to link up with the rest of us to raise our collective status.

We all need to reach out professionally to those who have commented, those who have shown anger and interest in this disaster, and raise our collective voices to shout 'NO MORE!' We need to get together and elbow the Glyn Evanses of this world out of the way, because we are citizens as well as webbies and as such we know we can do better than this crap served up to us as 'transformation'.

This episode should be used as an object lesson in how-not-to-do-it and it should mark a turning point for web professionals in local government.

Make it so. Yes we can. You have the power.



How to write a good tweet

Tweety (image via Wikipedia)

My guru Jakob Nielsen has published on research Nielsen Norman Group has just completed on Tweet usability.

To my knowledge, no one else has done this sort of research, although many of the recommendations have been picked up elsewhere and some are plain common sense (not that that often stops people from ignoring it ... !).

He walks us through their design of a promotional tweet, going through five iterations to end up with:
LAS VEGAS (October) and BERLIN (November): venues for our biggest usability conference ever http://bit.ly/UsabilityWeek
Here are some points from this tweet's development (with some embellishment from me):
  • Capitalising city names draws the eye, breaking up the quick scan
  • Because when people scan they typically only read the first few words of a sentence, those first words need to be information-rich
  • Promotional tweets are easily ignored, so include some sense of news/newness to make them useful / less obviously promotional / more compelling
  • Tweets should be 130 chars or less to allow for retweeting
  • Full sentences aren't necessary in short content, which users are scanning, so ruthlessly chop unnecessary words and use quickly comprehensible characters like '+' and ':'
  • Use a meaningful URL - which may appear elsewhere alone and out-of-context
  • A tweet should be highly focused and not try to make multiple points
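For fun, some of these guidelines can even be automated. A hypothetical little linter, sketching just the retweet-length check and a crude test that the first word is information-rich (the word list and threshold are my own assumptions, not Nielsen's):

```python
# Flag tweets that break two of the guidelines above: the 130-character
# retweet budget, and starting with an information-poor first word.
RETWEET_BUDGET = 130  # leaves room for "RT @username: "

def lint_tweet(text):
    problems = []
    if len(text) > RETWEET_BUDGET:
        problems.append(f"too long to retweet ({len(text)} > {RETWEET_BUDGET})")
    # A rough stand-in for "first words must be information-rich".
    weak_openers = ("we", "our", "the", "a", "i", "just")
    first_word = text.split()[0].lower().strip(".,!:")
    if first_word in weak_openers:
        problems.append(f"weak first word: {first_word!r}")
    return problems

tweet = ("LAS VEGAS (October) and BERLIN (November): venues for our biggest "
         "usability conference ever http://bit.ly/UsabilityWeek")
issues = lint_tweet(tweet)  # Nielsen's final iteration passes both checks
```

Nielsen's final tweet comes back clean; the earlier iterations he shows would not.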
Nielsen frets over when to tweet, concluding that 7:51 a.m. Pacific time is his best option as it also catches Europe.

However, for those other than him it may be more useful to use a service like Twittertise (which allows you to schedule tweets) to overcome Twitter's ephemeral, stream-driven nature and hit more of your likely audience amongst your followers. (Nielsen says that, judging by click-through decay, Twitter time passes ten times faster than email time.) Obviously don't over-egg this!

Finally he reiterates a point which really needs driving home in my experience as many people don't get it:

Text is a UI

and this applies doubly when you are talking about short text and when you're calling people to action.




Tuesday, August 25

Obama's 'Cash-for-clunkers' has a major form usability #fail

COLMA, CA - JULY 31: A sign advertising the '...' (image by Getty Images via Daylife)

Jakob Nielsen points me to an astonishing statistic from the cash-for-clunkers programme currently being hailed as a great success by the White House.

The multi-billion dollar scheme, where old car models can be turned in for new ones in exchange for a rebate, is designed primarily to boost auto sales rather than green America's roads.

From the New York Times:
The government is tripling the size of the work force assigned to handle the applications.

In many cases, the administration says incomplete forms or errors in the information submitted by dealers are slowing the process. Workers have reviewed about 40 percent of the applications filed, and many have been rejected and then returned to the dealer for possible resubmission.

Laura Sodano, a sales manager at Curry Chevrolet in Scarsdale, N.Y., said dealers were not told why their applications had not been approved and were having to review the entire form to determine what went wrong.
The New York Times doesn't say it so Jakob has to:
The 13-page form(!) is too complicated and many people fill it in wrong, leading to double work in both car dealerships and the government agency processing the applications.

Think of how much hassle and work they could have saved if they had spent a few days on usability and iterative design before inflicting this form on the public. The same user-testing methods can be used for paper forms as for online forms, and the error rate could have been cut to half of the current numbers by a day's worth of iterative design and testing. (It's often possible to cut errors to one-fifth through a few weeks' work.)
Jakob also points to another New York Times piece which reminds about one of the oft-forgotten basics for usable forms, plain English.

John Aloysius Cogan Jr, the executive counsel for the Rhode Island Office of the Health Insurance Commissioner, talks about the need for forms (and policy documents) to match an eighth-grade reading level.

He says:
The health care reform bill now under consideration in the House of Representatives includes a proposal that certain disclosures in insurance policies be made in “plain language.” Another piece of legislation now being considered by both houses of Congress would likewise require uniform and simplified coverage information, much like what’s required on nutritional labels. These are excellent proposals, but they do not go far enough. Plain-language disclosures of some policy information and consumer-friendly labels are no substitutes for making an entire policy readable.
Cogan Jr. says that the state of Rhode Island now requires health insurance documents to be written at the 8th-grade reading level.

Says Nielsen:
We have long recommended writing Web content at this level for sites that target a broad consumer audience:

> http://www.useit.com/alertbox/20050314.html

Some designers complain about this guideline, claiming that it leads to overly simplistic sites. But check out the before/after writing samples in the RI article: you'll probably agree that the 8th-grade writing represents the material just fine and is much easier to understand (even if you personally have the skills to read university-level content).
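For anyone wondering how an '8th-grade reading level' is actually measured: it's usually a formula such as Flesch-Kincaid, driven by sentence length and syllable counts. A rough sketch (with a crude vowel-group syllable heuristic; dedicated readability tools do this better):

```python
# Flesch-Kincaid grade level: 0.39*(words/sentence) + 11.8*(syllables/word) - 15.59.
# Syllables are approximated by counting runs of vowels in each word.
import re

def syllables(word):
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def fk_grade(text):
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syl = sum(syllables(w) for w in words)
    return 0.39 * len(words) / sentences + 11.8 * syl / len(words) - 15.59

# Made-up before/after examples in the spirit of the RI article.
plain = "We will pay for your care. You must send the form first."
legalese = ("Notwithstanding the aforementioned provisions, reimbursement "
            "eligibility determinations necessitate comprehensive documentation.")
```

Running `fk_grade` on the two samples shows the point: short words in short sentences land well under grade 8, while the legalese scores far above it.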







Saturday, July 18

Two flip interviews at #psw09

Socitm held its annual 'Building perfect council websites' event this week at Olympia, London. Must have been over 200 people there and a very interesting line-up of contributors.

I took my Flip camera along and managed to grab a couple of people for short interviews.

Kingston upon Thames councillor (and former Mayor) Mary Reid is a real pioneer in use of social networks in the UK. She's had a blog for a long time and driven a lot of development in her council. Here, continuing from a point I'd made in one of her sessions, she talks about whether the development of social media policy is really needed.



Gerry McGovern is an old hero of mine. He's basically a usability guru but he doesn't talk like one. His presentation style is very funny, pointing out the silliness of how much web design looks to an outsider - woods'n'trees stuff.



Here's the website for 'psw09' with links to other posts and presentation slides.

It was also the 'soft launch' for Socitm Web Professionals - of which more later!

Socitm web professionals banner

Monday, March 23

Changing paradigms with UI for mobiles


An excellent article by Christian Lindholm, formerly of Nokia and now working for Fjord, neatly wraps up where mobile design has come from, where we are and where we might go.

He says that there are three levels to the user experience of mobiles:

  • The highest level I call Bling (this is because, it caters to the visual senses) it contains the visuals, colours, content density and partly motion.
  • The next level below it is Control (This caters to the mind or rationale) This is where the efficiency is created, where one gets stuff done, one navigates into applications, within applications and between applications. It is where services should be integrated. It is much more than functionality, more than an application.
  • The lowest level of a user experience is the Utility level. In this level one experiences such thing as application installation, network control, power management. It is where latency is managed. This level of user experience is almost totally provided by engineering, except when operating at world class level, when UE designers and Engineers co-operate deeply.
He says that most of the current innovation is in the top two layers, with only Google doing much with the utility layer - and this makes sense because "they are a utility". What they're doing reminds him of the early days of GMail, which wasn't different to Yahoo or MSN four years ago (except for the space) but has now "become a cloud based content platform integrating core elements of your digital life".

He cites the most recent second layer paradigm shifts following on from iPhone's 'bling' touchscreen as pinching, flicking, flipping "into 'back' of application like in iPhone Weather or flipping below, like in the Maps app. In the PalmPre there are the cards and their shuffling, the Chucking, meaning closing the app".

These shifts every phone maker takes up as they operate like 'search, top-right' on websites or other new design 'norms': "they cannot be customised from customer to customer". Fortunately we have yet to see commodification of these new paradigms, through patent enforcement, to the detriment of users.

Where he sees the next development coming is smarter keys: "Text input on touch screens are simply too bad. People like moving keys and the sensory feeling they provide".

This rings true for me as it's what I went for with my new phone, which is a new one from Hewlett Packard. It has a great slide out keyboard which doesn't just have slightly bigger keys but also gives that slight finger feedback which I found absent on others.

One other area which he doesn't mention is learning. It occurred to me that my supplier was missing a trick by not offering paid lessons on how to get the most out of your phone. Am I missing something here? I have yet to see a site which does this job (or does everyone just not want to admit there's some stuff they can't work out?).

Sunday, March 15

The age of stupid


Something in Alistair Campbell's blog caught my eye. He, like many others, was lamenting the changes to Facebook's interface.

Last night, I was trying to put a message on Alina's wall to thank her for sending me a Canadian review of my novel, and for doing the New Statesman piece. It went up as a status update. So then I put up something lamenting my failure to differentiate between a message and an update, and added as an afterthought ... 'and where did this new [Facebook] design come from?'

So this morning I tried to work out whether I was already, after just a few weeks, becoming a bit small c conservative about life online, (like those right-wing bloggers who can't get used to Labour people being here now, and pick us up on our twitter etiquette, whatever the hell that is) or whether in fact, the design changes made are just bad changes made for the sake of change.
I will mull all this as I go out on my road bike in this beautiful sunshine, and prepare to watch the new film on the environment, The Age of Stupid, later today. Now that is going to be a changemaker. I just know it.

I would have explained myself better if I had been able to track back through comments on a few earlier Facebook postings. Or if I could find a way, quickly, of scanning through all the comments that came in to various updates in the last 48 hours when I have been away from my desk. But I couldn't for the life of me work out how to do it. I could do it the day before yesterday.

Now the title of this post is deliberate - Campbell's obviously not stupid; some might think the opposite ('evil genius'). But if I've learned anything from reading Jakob Nielsen for a decade it's this: most people using the interwebs are not that good at it. And most interfaces don't work for vast numbers of people most of the time.

Nielsen keeps reporting this empirical truth.

When you have online properties whose remit requires that practically everyone be able to use them, surely the need, the techniques and the simple methods to achieve this should be front-and-centre?

I don't feel they are in egov though. Oh they're there but they're not front-and-centre, and as Campbell says, in the rush for change you end up sounding like a small c conservative if you say 'hang on a minute'...

But having my contrarian streak, as well as being long in the tooth, web-wise, I will :]

So, with all the buzz about social networking and engagement where is usability? Does this 'stuff' pass the mom test?

Sunday, February 1

Barcamp rocks



The BarcampUKGovweb09 event was, first and foremost, fun.

The format of everyone contributing and the 'controlled anarchy' contributed to this feeling.

Wikipedia:
Barcamp = user generated conferences — open, participatory workshop-events, whose content is provided by participants.

I had two ideas. One was tying together blog posts I've done on 'How Obama does it' and 'How Labour isn't'. I ended up merging this with a 'politics 2.0' workshop idea which was far more prepared (and mostly went over my head) and just chipping in my 2p worth.


Thoughts:
  • Yes, UK politics is different, but American developments have a wicked way of making their way across the pond. There is definitely nothing but value in looking'n'learning from the mechanics and politics of how this experiment is panning out.
  • Many, if not most, of the participants were focusing on how social media feeds back into decision making, organising and policy. But if new tools allow decision makers to get metrics on how the UK-wide social media discussion is playing, doesn't that trump how certain interests and players attempt to intervene?
  • Yes, we do need a UK version of the Huffington Post.
My second contribution was my old chestnut: discount aka guerilla usability testing. I first presented on this back in 2006 and used the exact same presentation!

It went somewhat against the tide of contributions and, as a mate said, 'ow, something about the web!'

It wasn't a big crowd (grand total, seven) but consisted of some VIPs in this small egov world. It was also almost a collaborative presentation as people chipped in with their ideas and experience. Which was great!

As well, a guy from the Office of the Qualifications and Examinations Regulator, Philip McAllister, attached his presentation idea to mine (this is how Barcamps work) so we got some fascinating stuff at the end of the hour about his work on qualitative analysis of his site. This fitted well as it was, again, not a 'black art' but simple to do, with vital results generated for understanding how your website (aka brand) comes across.

Thoughts:
  • One thing I've learnt from doing this in practice is the importance of social skills in the people trying it.
As I said, the day was lotsa fun. But there are a coupla issues.
  • This format doesn't work so well for the less confident, unless this is managed. I can think of a few people I know with much to contribute (I'm thinking of you, GoogleMaps genius man) but far fewer of the skills needed to put it across.
  • Boystown. Male, male, male. Summed up by the changing of the notice above the feeding area from 'Would you expect your mum to clean this up?' to 'Dad'.
  • The great mix of inside/outside/across government was welcomed by everyone I spoke with. Therein lies a nerve to work.
And this to-do:
  • Join Twitter. Sigh.
~~~~~~~

Simon Dickson has a good round up here. (And it was lovely to finally meet you too :} )

He talks about this stuff from DirectGov, which very much interested me. I haven't posted this before, but they have widgets on the way — something I've been carping on about for, owh, two years? Though more from a marketing angle than from the angle BarCamp would take.

Here's all the tagged content.

Thursday, December 18

'E-democracy' and, er, democracy


Jakob Nielsen has produced for Pew a very interesting usability study of voter information websites from all 50 US states and the District of Columbia.

Some points of interest. He identifies these neglected usability aspects:

  • Homepage usability
  • Search
  • Accessibility
  • Web presence (that is, how users get to content from outside the site, or "usability-in-the-large")
And makes this spot-on comment on his results:
there's a negative correlation of r=-.1 between homepage usability and accessibility ... the negative correlation indicates that designers aren't treating accessibility as a component of user experience quality. Most likely, government agencies are focused on complying with legalistic accessibility regulations instead of trying to make the sites easy for people with disabilities to use.
As an observer of 'e-democracy', where's usability in the mix? Well it's nowhere - because it's simultaneously nowhere in egov. That's true of the UK and - Nielsen suggests - the US also.

Not very democratic, I'd venture to suggest.

According to the E-Access blog, Robin Christensen, now of AbilityNet and formerly of the RNIB, reviewed the FAB! NEW! WEB 2.0! Number 10 Downing Street website and found it wanting:
While relatively accessible in many ways, still has various untagged links which read simply ‘click here’ [sic], offering the audio browser no clue as to what lies behind. The website also features auto-start videos, with unlabelled control buttons, so that blind users are confronted with video noise drowning out their own audio controls and cannot work out how to turn it off.
A very polite way to put it. Picture the scene ....

Again with the not-very-democratic.

Sez Jakob:
There's a reason that we have a "total user experience" concept to encompass everything that users encounter. It's not enough to have a great design for part of the user interface. Good navigation, say, is certainly a necessity for a great user experience, but it's not sufficient. Offer a bad homepage, and users might turn away before they even start navigating.

We can liken a website's user experience to the metaphorical chain that's no stronger than its weakest link. If any one usability attribute fails, the overall user experience is compromised and many users will fail.
It's all very obvious, really - auto-start videos FCS!! Unless one is sitting inside a walled garden ....

Wednesday, September 3

Scrapbook clips catch up


I've been repeatedly hearing a bizarre (to my ears) ad running on the Olbermann netcast - Kraft 'natural, 2% milk' cheese with ... drumroll ... "no added growth hormone". Only in America?

I shouldn't mock. We have crap food here too. But something called Velveeta, which "doesn't need to be refrigerated after opening"???

Google has launched an Elections Video Search gadget which uses speech recognition.

Using the gadget you can search not only the titles and descriptions of the videos, but also their spoken content. Additionally, since speech recognition tells us exactly when words are spoken in the video, you can jump right to the most relevant parts of the videos you find.
+ Google kills the Google bomb :{

Hah (sorry, shouldn't laugh).
A disgruntled city computer engineer has virtually commandeered San Francisco's new multimillion-dollar computer network, altering it to deny access to top administrators even as he sits in jail on $5 million bail.
US 'Department of Homeland Security' is seriously suggesting that airline passengers wear 'security bracelets' which would deliver taser-like shocks if they 'fail to comply' - seriously.

Here's another shock horror story in 'the war against tourism':
And it wasn't enough for another woman to show TSA agents nipple rings that set off a metal detector. The agents forced her to take them out.

Mandi Hamlin said, "I had to get pliers and pull it apart."

In Chicago, people like Robert Perry are subjected to exhaustive security checks. He was patted down, his wheel chair was examined and his hands were swabbed, all in public view in a see-through room at the security checkpoint. Perry, 71, is not alone.

"It's humiliation," Perry said.

Perry was also taken to a see-through room by a TSA agent when his artificial knee set off the metal detector.

"He yelled at me to get the belt off. 'I told you to get the belt off.' So I took the belt off. He ran his hands down over and pulled the pants down, they went down around my ankle," Perry said.

At that point, Perry was standing in his underwear in public view. He asked to see a supervisor. That made things worse.
Tracey Ullman has a great character, Chanel Monticello, taking da piss outta this shit.

Delightful story about how Karl Rove, aka 'Bush's brain', threatened a webbie:
If he does not "'take the fall' for election fraud in Ohio".
No wonder they're losing online, who'd want to work for them?

Computing magazine had a good-news story about the NHS IT project - the biggest non-military IT project in the world - focusing on Homerton Hospital. All great, practical, working properly, stuff. Pity that a/ it's not easily found on the web and b/ Labour is making nothing of it.

eGov: New figures from NWEGG show that:
A ‘self-serviced’ web transaction is 24 times less costly than a telephone transaction and 46 times less costly than a face-to-face transaction.
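To make those ratios concrete, here's a back-of-envelope sketch. Only the 24x and 46x ratios come from the NWEGG figures; the face-to-face cost used below is an assumed, made-up number purely to show the scale:

```python
# Only the ratios (web is 46x cheaper than face-to-face, 24x cheaper
# than phone) are from the report; the base cost is an assumption.
face_to_face_cost = 7.40           # assumed cost per transaction, in pounds
web_cost = face_to_face_cost / 46  # implied cost of a web transaction
phone_cost = web_cost * 24         # implied cost of a phone transaction

print(f"web: £{web_cost:.2f}, phone: £{phone_cost:.2f}, "
      f"face-to-face: £{face_to_face_cost:.2f}")
```

Whatever the true base cost, every transaction shifted from the counter to the website costs councils a small fraction of what it did.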
According to the Daily Mail (FCS):
Ministers had so far failed to put sex education on a statutory footing in the national curriculum.
AND
Attempts to search for advice on school computers were often frustrated by filters which block sites containing sexual words.
New York Times piece on the challenges of being a Tekkie in Kenya:
Consider Wilfred Mworia, a 22-year-old engineering student and freelance code writer in Nairobi, Kenya. In the four weeks leading up to Apple’s much-anticipated release of a new iPhone on July 11, Mr. Mworia created an application for the phone that shows where events in Nairobi are happening and allows people to add details about them.

Mr. Mworia’s desire to develop an application for the iPhone is not unusual: many designers around the world are writing programs for the device. But his location posed some daunting obstacles: the iPhone doesn’t work in Nairobi, and Mr. Mworia doesn’t even own one. He wrote his program on an iPhone simulator.

Here's good CRM for you. From an email:

We couldn't help but notice that it's been a while since you've visited Current.com, and it's bumming us out.

If you have a moment, we'd love to hear from you about your experience on Current.com, what did or didn’t work for you, and how we could make things more enticing for you in the future.
Lincolnshire is truly pioneering with eGov. Apart from the ads they are:
As part of their “Accessibility tested by humans” strategy, Lincolnshire’s website will be tested every 3 months by a panel of disabled users with disabilities ranging from cerebral palsy through to dyslexia. Results will then be published on Lincolnshire’s website for anyone to see.
I completely agree with this more 'social' attitude to accessibility.

Here's Whitehall's approach:
The draft had threatened to switch off non-compliant websites altogether, warning: "websites which fail to meet the mandated level of conformance shall be subject to the withdrawal process for .gov.uk domain names". The final guidance issues a similar warning, but using the softer formula 'may be at risk' instead of 'shall be subject to': "Government website owners are reminded to follow the conditions of use for a .gov.uk name (Registering .gov.uk domain names (TG114)). Websites which fail to meet the .gov.uk accessibility requirements may be at risk of having their domain name withdrawn".
Monbiot point:
A few weeks ago the writer Mark Lynas found a counter-intuitive revelation buried in the small print of an ICM survey. The number of people in social classes D and E who thought the government should prioritise the environment over the economy was higher (56%) than the proportion in classes A and B (47%). It is counter-intuitive only because a vast and well-funded denial industry has spent years persuading us that environmentalism is a middle-class caprice
How guardian.co.uk stays atop the pile:
In the past two months, we have started to combine search engine optimisation - talking to the news desk on the paper about SEO-friendly headlines and underlining SEO with our subs desk [on the website] - with our marketing and pay-per-click activity. If you do two to three small things at one time that can be very significant.
Etre (newsletter only) had a great post about 'cognitive illusions', relating this to usability. Citing Bruce Tognazzini from 1989, it notes that:
1) Users consistently report that using the keyboard is faster than using the mouse.

2) The stopwatch consistently proves that using the mouse is faster than using the keyboard.

This illusion reveals a much more important learning: Users' perception of reality and reality itself are not the same thing - which means that you should always verify their claims through research. You should also take pains to validate your own intuitions, because even when you're certain of something, you can still be very wrong.
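The keyboard-vs-mouse finding is a nice illustration of why the stopwatch should trump self-report. A small sketch with hypothetical per-trial timings (the numbers below are invented for illustration, not Tognazzini's data):

```python
# Hypothetical seconds per task for the same user on the same tasks.
# Users "feel" the keyboard is faster; the measurements say otherwise.
keyboard_times = [4.2, 4.0, 4.5, 4.1]
mouse_times = [3.1, 3.3, 3.0, 3.2]

def mean(xs):
    return sum(xs) / len(xs)

faster = "mouse" if mean(mouse_times) < mean(keyboard_times) else "keyboard"
print(f"stopwatch says: {faster}")
```

The gap is invisible to introspection because keyboard use keeps the conscious mind busy while mousing does not, so the mouse time *feels* longer. Hence the rule: measure, don't ask.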
Etre's blog had an interesting post about a new ATM interface for Wells Fargo. ATMs are thirty years old - proving that usability is an ongoing and never-ending process.

Dave Briggs is running an event in Peterborough relating the ReadWriteWeb to the needs of local government. Check it out.
Featuring case studies from both local and central government, practical exercises to learn more about how social media could be used within a local authority context and plenty of time for networking and chats over coffee.
eGov AU on why UK lgov sites are better than Australian ones. Unfortunately he cites Redbridge :{

Information ain't free. After a long break with no email, I had a notification about a citation of that 'suicide and the internet' BMJ article (which I went to work on). But of course I couldn't read it because JAMA's medical research is behind a paywall.

That's cleared some clips :}

Just this to add - from the online journo Michael J Totten: The Truth About Russia in Georgia:

He raised his hand as if to say stop.

“That was the formal start of the war,” he said. “Because of the peace agreement they had, nobody was allowed to have guns bigger than 80mm. Okay, so that's the formal start of the war. It wasn't the attack on Tskhinvali. Now stop me.”

“Okay,” I said. “All the reports I've read say Saakashvili started the war.”

“I'm not yet on the 7th,” he said. “I'm on the 6th.”

“Okay,” I said. He had given this explanation to reporters before, and he knew exactly what I was thinking.

“Saakashvili is accused of starting this war on the 7th,” he said.

“Right,” I said. “But that sounds like complete bs to me if what you say is true.”

Thomas Goltz nodded.