- Website search engines “often fail to work satisfactorily”.
- Nearly 25% of departments do not know who is using their sites, or how much they cost.
- Some sites are difficult to use, too “text-heavy” and filled with policy material that is irrelevant to the visitor.
- Between 2001 and 2006, one in six sites had got “significantly worse” in terms of quality.
Public Sector Forums reported last month, in an exposé about the supposed cull of central government websites, that Directgov is giving advice to the Americans. They found minutes from a meeting in January which outlined the strategy.
The amusing thing is that the minutes came from the Webmaster's University wing of the US government, and the UK has b****r all like it. In other words, there is no mechanism to spread knowledge and information about the very failings the report highlights.
This is why usability isn't front and centre: all of the 'guidance' sits somewhere unpromoted and unloved, and commercial lessons aren't transferable because they're usually not seen as relevant (and there is zero traffic between the Web World and eGov). The gaping hole in UK eGov where Google should be is filled in the US.
I had started going through the minutes to point out all the daft bits but, to be honest, I can't be arsed. Here's one blinding example from someone who's obviously never heard of the 'long tail':
Q: How did you use web metrics—analyzing how visitors were using government websites—to support the initiative?
A: Yes, web metrics were taken into consideration. For example, if a site has high traffic, we need to be careful about how to move content since it will affect a lot of people. But more than specific data, we asked higher level questions, like: what content is on the site, what is the purpose of the site, who are you talking to? Answers to those questions were the most important factors in helping decide whether content should reside on the DirectGov portal, stay on the corporate departmental website, or be taken down.
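For what it's worth, here's a back-of-the-envelope sketch (mine, not from the minutes, using made-up Zipf-distributed visit counts) of why "be careful with the high-traffic pages" misses the long-tail point: under a typical power-law popularity curve, the many low-traffic pages collectively carry a huge share of total visits, so culling or moving them wholesale is anything but low-impact.

```python
def zipf_traffic(n_pages, s=1.0):
    """Synthetic visit counts: page at rank k gets traffic proportional to 1/k^s."""
    return [1.0 / (k ** s) for k in range(1, n_pages + 1)]

def tail_share(visits, head=20):
    """Fraction of total visits carried by pages ranked below the top `head`."""
    total = sum(visits)
    return sum(visits[head:]) / total

# A hypothetical site with 1,000 pages: the "unimportant" pages ranked
# 21-1000 still account for roughly half of all visits.
visits = zipf_traffic(1000)
share = tail_share(visits, head=20)
print(f"Pages ranked 21-1000 carry {share:.0%} of all visits")
```

The exponent and page count are arbitrary, but the shape of the result holds for any Zipf-like traffic distribution: the tail is where most of the aggregate use lives.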