LGBT Asylum News (formerly Save Mehdi Kazemi), which I manage, is now on Twitter. Follow to get notified of all new posts.
New blog
Saturday, September 12
Follow LGBT Asylum News on Twitter
Postscript: Lessons from the great 2009 Birmingham City Council website disaster
The fallout from the relaunch of Birmingham City Council's website (#bccwebsite) has continued, not just online but in the local press as well, thanks to the strong interest of Birmingham Post Editor Marc Reeves.
It's no coincidence that the Post has a 'web 2.0' site: in its reporting about #bccwebsite it has even included comments left on its news stories, as well as comment sourced from online feedback - including mine.
But by far the most interesting development is the first step in the concretisation of webbie attempts to influence the council's web development, in the shape of a wiki (captured above).
This echoes some of the major developments bringing together webbie citizens and government around the world, such as the just-launched Code for America, who say: "we believe there is a wealth of talent in the web industry eager to contribute to the rebuilding of America."
It seems there's a wealth of talent wanting to contribute in Birmingham as well. Whether they will be heard depends as much on their ability to lobby as on their ability to create useful stuff online - and, of course, on the council's ability to listen. Let's not start off being cynical!
On the Communities of Practice public sector social network, Socitm's Helen Williams noted my main point: that from this disaster (and acknowledging it as a disaster) Things Must Change:
I think there are some specific reasons for the level of interest (reaction to the high cost, perceptions of the third party contractor, the size of the council / city and the fact that it has many digitally active citizens). However, I also believe we have entered an era where councils can expect their websites and any re-launches to receive a whole new level of public scrutiny and comment (and not just in the Web Improvement and Usage Community!)
Let down by Comms, and politicians?
As I noted before, part of the disaster has been a PR one.
In private conversation I have expressed my sympathy for the staff member forced to step up as the fall guy. The responses of Glyn Evans, Birmingham City Council's Corporate Director of Business Change, have been unfortunate, but they also betray a lack of help from the Council's PR department.
Even more unfortunately, why is he the fall guy and not a local politician? Doesn't it say something about the political leadership's attitude to their website - and possibly explain why they got into this mess in the first place - that they can watch it be trashed in the press and online and say nothing?
Birmingham Post Editor Marc Reeves agrees with this analysis in a comment on my blog post:
I don’t know Glyn Evans very well, but I do know he’s an effective and passionate public servant who cares deeply about doing the best for the city of Birmingham and its citizens. Much of the opprobrium has centred on him, which I think is a little unfair, although no-one should shirk from holding him up to the ‘cabinet responsibility’ principle.
However, he has been let down by the absence of a cohesive, proactive strategic comms process which – if it existed – would have spared the council some devastating reputational damage, and Glyn this undeserved personal and professional embarrassment.
A well thought-out public affairs / public relations approach to this simply wouldn’t have let the website ‘launch’ proceed. The simple expedient of quietly announcing that the first ‘below the line’ phase of the web overhaul was complete, with functionality to follow, would have avoided this mess. If, as Glyn says, there are major improvements on the way, then simply wait for them to be up and running, then unveil all in a hail of publicity.
The website scandal just illustrates a much larger problem at the heart of BCC, I fear.
The non-reaction from local politicians (bar Sion Simon MP's retweet of my post) shows why the efforts of the Birmingham digerati (aka 'twitterati') need to be as much political as they are digital.
Amongst other developments:
- It has been suggested that the task of transferring content fell to council staff rather than the contractor, and that the statement '17,000 pages' actually means 17,000 content items.
- An old post by Charles Arthur in The Guardian has surfaced which contains the claim that the project was "essentially trying to knit 35 sites operating under the council's umbrella into a single one. "
- Reviewers have noted that the forms system, at the heart of any possible cost savings and 'service transformation', is extremely outdated, with "a bewildering array of options", and has bits which simply don't work.
- Reviewers have also noted that there are payment forms with no encryption.
- Questions have been asked about whether there was any public consultation as well as pre-launch testing/breaking and fixing (including usability testing), partly as comments by Evans have suggested this is happening post-launch - "there is little point in assessing our residents' perspective – the view we value most – at this stage".
- The exact role of the council's web team in the exercise remains a mystery.
- The CMS wasn't built by Capita; it is a commercial Java-based CMS called FatWire (source: Stef Lewandowski).
Going over the top (in another sense)
In a great post about the website, local web business owner Jake Grimley has bravely nailed his colours to the mast (more people with potential business to lose will have to follow Jake if efforts like that started by the Wiki are to succeed) and made concrete suggestions from his own experience of developing large and mission-critical websites. He also goes with my guess on what £2.8m actually bought.
He says:
For me, it’s not really the lack of RSS Feeds (inexplicable as that is) or the failure of the CSS to validate, or the difficulties keeping the site up on its launch day that really bother me. It’s the complete lack of attention to detail or quality in the content, design and information architecture that I find astounding. For those that need examples, there is a log of snarky highlights, but you just need to spend five minutes clicking around the site to see what I mean. It’s the equivalent of re-launching the Town Hall with bits of plaster falling off, missing roof-tiles, and sign-posts to facilities that never got built.
Another great follow-up post by Pesky People details the accessibility issues and comes out fighting:
At the moment they are saying Disabled residents in Birmingham are not important enough, as Glyn Evans was quoted in the Birmingham Post.
All the signs are that this one will not go away, for the reasons Helen Williams outlines. But it remains to be seen whether the 'Lessons from the great 2009 Birmingham City Council website disaster' will actually be learned any time soon. That's down to all of us, including you, the reader of this blog post.
~~~~~~
One other consequence
As I endeavoured to make clear in my previous post, none of this should be taken as a criticism of the City Council's Web Team. Unfortunately, it appears it may have been taken that way. It would be more than good to hear from them as developments continue - it would be invaluable.
Friday, September 11
A win-win in the cloud for UK Public Sector?
News arrives that Portsmouth Uni has moved all of its students into the cloud, with free, multi-lingual access to an advertising-free version of Google Apps, including webmail with 7GB capacity, online documents, spreadsheets and calendars, chat and collaboration, and site-building functions. They have around 30,000 students, who will get @myport.ac.uk addresses.
In a deal with Google - which offers similar terms for academe in the US - they get it free for four years with no ads (Google quoted a commercial price of $33 per user per annum at the #googlelocalgov event last month). Google will also - for free - establish application programming interfaces allowing data to be exported if the university decides to move.
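To put the scale of that in context, here is a back-of-the-envelope sum using the figures above (the $33-per-user-per-year commercial price Google quoted and Portsmouth's roughly 30,000 students) - illustrative numbers only, not an official costing:

```python
# Rough notional value of the Portsmouth/Google Apps deal, using the figures
# quoted above. These are illustrative numbers, not an official costing.

users = 30_000            # approximate number of Portsmouth students
commercial_price = 33     # USD per user per year, as quoted at #googlelocalgov
years = 4                 # length of the free deal

notional_value = users * commercial_price * years
print(f"Notional commercial value of the deal: ${notional_value:,}")
# -> Notional commercial value of the deal: $3,960,000
```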
The Uni believes that, despite recent brief downtime, Google will provide a better service than doing it in-house. Answering critics, they point out that Google is signed up to the Safe Harbour framework, which commits a US company to comply with European data protection standards even when information is stored outside Europe.
One of the best-received presentations at the recent #googlelocalgov event was about Google Enterprise Tools. Given that the Government's CIO, John Suffolk, is promoting the take-up of cloud computing, it would seem that Google is kicking at an open door, and pushing Apps will be a key task for them (disclosure: I am currently in talks with Google).
This is a task they are taking up: their local-government-targeted website has a savings calculator which specifically compares Google Apps with Microsoft Exchange 2007, and Socitm 2009 in Edinburgh in October has Adrian Joseph, Managing Director of Google Enterprise, as a key speaker.
However, the furious debate over the take-up of Google's cloud in US local government should flash some warning signs about its potential progress in the UK - and about where a backlash might come from.
The discussion over the pond on the cloud computing push by White House lead Vivek Kundra has generated heat and light over issues such as transparency and how regulatory constraints, certification and authorisation should be handled and managed. This shows how the cloud's seemingly obvious, cost-saving advance into UK local government could become bogged down.
Those with most to lose, such as Microsoft, will undoubtedly have their own strategy to throw mud and it will be fascinating to see how this plays out, including how the incoming agenda and policy setters in Whitehall react.
Wednesday, September 9
Lessons from the great 2009 Birmingham City Council website disaster
The night before last - overnight, and without much fanfare - Birmingham City Council switched over to its rejigged website.
Within moments the twittersphere was alight. It was crashing, it had obvious faults and it looked terrible. Over the next 36 hours reviewer after reviewer found fault after fault.
This would not be news - how often do bad government websites get launched? - if it hadn't been for the efforts of Paul Bradshaw (a Birmingham-based lecturer in online journalism) and his project Help Me Investigate.
Following up on local Brummie whispers, they put in an FOI request to the council and discovered that the cost of this website revamp was, wait for it, £2.8m.
What they bought
Since the launch the lead for the council, Glyn Evans, director of business transformation, has written two blog posts and done a couple of interviews.
From this - not from information provided as a result of the FOI, because of 'commercial confidentiality' - it is possible to glean that the £2.8m essentially covers building a content management system (a CMS initially presumed to be built from scratch, in fact the commercial Java-based FatWire), transferring the content and building a workflow system (and there's a mention of training staff). Oh, and ‘find my local’, which much smaller councils have had for ages.
From this new CMS they will be able to offer better (i.e. now standard) online services integrated into it, and basics like RSS feeds - but nothing suggests that the cost of those developments is included, and any webbie worth their salt will ask why any CMS should cost that much, even with bells, whistles and silver-service catered training sessions attached. Unfortunately we're unlikely ever to know the real details, because of that ever-so-useful cover-up tool, 'commercial confidentiality'.
On the reality of what it has actually bought, the council admits as much in its response to the FOI:
Yes, the new council website is costing more than was originally estimated but actually it’s not an overspend. Originally, the website replacement project was just that – a replacement for the (obsolete) technology we are using. And then we rethought; the new website will be at the heart of delivering improved services to everyone who lives in, works in or visits Birmingham. This is a totally different proposition, and a totally different approach – and budget – was required.
Also, yes, we did consider Open Source but no, it wasn’t suitable, otherwise we’d be using it.
And again, yes, the website is being delivered later than was originally planned but it’s not the same website as was originally planned.
None of this sounds like anything other than a CMS upgrade. And someone should ask for the papers on just how they decided that Open Source wasn't a solution.
Expensive leeches
Why was it £2.8m? It is hard to see how it could be that much - other than that the job was outsourced to the infamous public sector contractor Capita.
Says Stuart Harrison (aka pezholio):
By all accounts the web team have had very little involvement and most of the grunt work has been done by the council’s outsourced IT ‘partner’ Service Birmingham – operated by everyone’s favourite outsourcer Capita (or Crapita if you read Private Eye).
Stuart's comment about the rumoured exclusion of the council's own webbies from the website development process is underlined by a comment on Josh Hart's blog about the irony of the City Council Website Manager
I can relate. In my past capacity in local government there was a vast distance between my role in the 'egov community' (respected, invited to speak, present and comment) and my actual - professionally disrespected - influence in the council.
Every local gov webbie will have a tale of their own, or one told to them, about the uselessness of Crapita. I certainly have one - let's just say that, in my experience, usability is part of neither their culture nor their knowledge (and, according to reports about Birmingham, 'external testing' wasn't included in the original cost estimate).
When tested, the expensive system Crapita produced for Birmingham had a failure which just takes your breath away: it did not recognise pound or euro signs, apostrophes or quotation marks. They are a leech on the public sector who, like Rasputin, refuse to die despite failure after failure.
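The pound/euro/quotation-mark failure has the classic smell of a character-encoding mismatch. That is my speculation about the mechanism rather than anything the council or Capita has confirmed, but here is a minimal sketch of how that class of bug arises:

```python
# Sketch of the class of bug that eats £ and € signs and curly quotes: text
# encoded as UTF-8 but decoded as if it were Latin-1. A guess at the mechanism,
# not a statement about what the Birmingham system actually does internally.

original = "Fees: £25 (or €30) – it’s “simple” to pay online"

utf8_bytes = original.encode("utf-8")

# Decoding with the wrong codec silently mangles the text (e.g. '£' -> 'Â£')...
print(utf8_bytes.decode("latin-1"))

# ...while decoding with a codec that lacks those characters fails outright.
try:
    utf8_bytes.decode("ascii")
except UnicodeDecodeError as err:
    print("ASCII decode failed:", err)
```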
What they have produced is a disaster, with the failures noted so far including:
- It does not meet basic accessibility requirements. Twitterers noted that it launched with one accessibility statement which was changed the next morning to another which was more non-committal ('meets standard' to 'aims to meet standard').
- Stacks of incoming links have broken, including those at the top of Google natural search results.
- Masses of internal links are broken (the kind of thing even the simple automated check sketched after this list would have caught).
- Content is often simply appalling, such as an empty 'What’s New' page.
- Reviewers say online forms are broken.
- Absolutely no thought given to SEO
- Incomprehensible alt tags and title attributes for links and images
- Lack of validation on many pages
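For what it's worth, the broken-links finding is exactly the sort of thing a trivial automated check catches before launch. This is a rough sketch rather than a production crawler - it only scans links found on the homepage, and the starting URL and single-page depth are my own choices - but it shows how little effort such a check takes:

```python
# Rough sketch of a pre-launch internal-link check - not a production crawler.
# It fetches the homepage, collects same-site links and reports any that fail.

from html.parser import HTMLParser
from urllib.error import HTTPError, URLError
from urllib.parse import urljoin, urlparse
from urllib.request import Request, urlopen

START = "http://www.birmingham.gov.uk/"   # the relaunched site

class LinkParser(HTMLParser):
    """Collects href values from <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

def fetch(url):
    req = Request(url, headers={"User-Agent": "link-check-sketch"})
    with urlopen(req, timeout=10) as resp:
        return resp.read().decode("utf-8", errors="replace")

parser = LinkParser()
parser.feed(fetch(START))

# Keep only links pointing back at the same site.
internal = {urljoin(START, href) for href in parser.links
            if urlparse(urljoin(START, href)).netloc == urlparse(START).netloc}

for url in sorted(internal):
    try:
        fetch(url)
    except HTTPError as err:            # 404s, 500s etc.
        print(err.code, url)
    except URLError as err:             # DNS failures, timeouts
        print("UNREACHABLE", url, err.reason)
```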
Absent PR or just arrogant PR?
The manager sent out to respond to all this, Glyn Evans, has been rapidly digging himself into a bigger and bigger hole. Completely refusing to address any of the major faults found, he has instead been reduced to accusing questioners of being members of the 'Twitterati' and to pulling the stunt of finding his own expert to claim the website is all shades of stunning.
It is a major irony that the council's PR department actually isn't within the website: a while back they launched their own separate site, complete with YouTube channel and Twitter feed. They obviously have the power to shape their own online destiny but not the power to control the council's messaging - Evans' statements have been a PR disaster, with the Birmingham Post's editor furious and undoubtedly vengeful.
The effect on the city
Another element to the PR disaster is the effect on the attempts by the city to grow its digital sector.
That sector has reacted with sharp intakes of breath, expressions of horror and rapid distancing.
Martin Belam presaged this in his comment on the first news of the cost:
There is a rather fantastic irony of it happening in the city outside of London which has perhaps the most vibrant and vocal digital scene in the UK.
Successful local enterprise Clarity Digital said:
The most frustrating part is that on the one hand the City Council, AWM [Advantage West Midlands] and others want to develop the regions’ creative offering. On the one hand the City is encouraging digital and talking up the region’s ability to lead the economic recovery through a thriving, innovative digital industries. And then they release this nonsense that does nothing to help the credibility of these ambitions.
In a brilliant post which attracted a stack of comments, Jon Hickman, from the research centre of the Birmingham School of Media at Birmingham City University, looked at the problems of managing such a large project, and the possible solutions.
The Council’s response was that the site is for the residents and not the Twitterati. That comment alone shows the sheer lack of understanding of digital in general and social media in particular. Most of those commenting were residents, business owners or people working in the City. Twitter enabled us to discuss this, for people to air their views. Prior to the Internet, a story like this would have resulted in a pile of letters to the local newspaper. Now people can discuss and debate via a range of sites, blogs and of course Twitter.
The mass of concerned contributors points to the wealth of local - and national - professional goodwill which a local authority could draw on to improve its website if only it would get its head out of the sand (or its arse) and stop digging itself deeper into a hole of its own making.
Said Jon:
So if the root of the problem is the procurement process, what would I suggest instead? I’m going to be completely idealistic, and I do realise that there are quite huge barriers to this sort of change, but I have an answer. Here’s a manifesto for better procurement of web projects (it would work for IT projects and service design too):
1. Let’s focus on needs and the original problem, not best guess briefs by well meaning non-experts.
2. Let’s be aware from the outset that your best guess cost is going to be wrong, and the project will be late.
3. Let’s allow for changes in a radical way: not just contingency days in the spec, or a change management process.
4. Let’s budget for innovation.
5. Let’s open up the project and let users shape the end result.
Amongst the fascinating comments, Jake Grimley (Managing Director of Made Media Ltd. in Birmingham) said:
In my experience, the developers and project managers tend to care deeply about the problems they’re solving and feel that they are fighting a daily battle against organisational politics, third-party software vendors, and basic client misunderstandings to get those problems solved. For that reason all websites are usually a compromise between the ideal, and the practicalities that have to be accepted in order to deliver.
And that last word is key. Steve Jobs said that ‘real artists ship’. Solving the problems in new and innovative ways is nice, but delivering *something that works* is more important. When we work with larger public sector organisations at Made, we sometimes find that delivery is not really at the core of the organisation’s ethos. For that reason it has to be us chasing the client, driving delivery rather than the other way around. That commitment to delivery takes sustained effort and joint experience, which is what would make me sceptical about your ‘hire a bunch of freelancers and give them six years’ approach.
Another commentator was Brummie developer Stef Lewandowski:
Other than that, what you’re describing is similar to agile software development methodologies. These always appeal to programmers because they save one from having to think it all through before you start coding. However this methodology is completely at odds with the classic public sector tendering system, which attempts to specify the project in its entirety before even speaking to prospective developers. But then, if you had £600K to spend, wouldn’t you want to know what you were going to get for it before you signed the contract? In addition, agile methodologies do not work well in large political organisations, because they rely on a client sponsor who’s not afraid to take quick decisions when presented with alternatives. Does that sound like Birmingham City Council to you?
[Solution?] This one’s quite simple I think:
Commission a specialist website development company with a track record of delivering complex websites to budget and to timescale, rather than – say – a generic IT outsourcing company.
The irony here is that the reality of the web team putting together the new BCC site is possibly uncannily similar to the ‘ideal’ process Jon outlined.
From experience of public sector tendering, if you don’t match the brief fully in the pre-qualification stage then you have no chance of getting shortlisted. So the larger the project, the smaller the chance that the commissioned web company will have to influence the brief they are responding to?
These comments show not only the willingness of professionals to help but also the frustration with, and drawing away from, government which the actions and comments of so many 'let's-pat-ourselves-on-the-back' bureaucrats like Glyn Evans provoke.
Lessons
What is the number one lesson I'd like readers in local government to take from this abject disaster? Rebellion.
Local government webbies did not cause this (as Stuart Harrison points out) and would not have made the series of bad decisions which led to the ship hitting the iceberg. Why? Because they're professional webbies and know what the heck they are doing.
This project was not run by them; it was outsourced by non-webbies to a company which is not about the web, doesn't live to build brilliant websites, and operates in an uncompetitive environment. Put simply, organisations like Crapita can get away with this and move on to the next excess-and-profits binge.
This disaster is the best argument for the new Public Sector Web Professionals organisation. Webbies need to develop the clout to call out those responsible for this and for the other council website disasters coming down the pipe.
Another point here is that local government website disasters know no party - all of them are responsible somewhere in the country for crap websites. In Birmingham it was Tories and LibDems, elsewhere it is Labour.
In Brum, Deputy Council Leader Paul Tilsley, who has overall responsibility for 'business transformation', admitted that he was wary of switching on the revamped website.
He said he feared journalists would be logging on at 9am on the first morning in an attempt to identify any “glitches”. Little did he know it would be webbies (aka what Evans dismissively and patronisingly described as the 'Twitterati') demolishing his council's website. One can only hope some Brummies make this political and hold Tilsley's feet to the proverbial fire.
What unites those wanting change is not party but profession. Those webbies in Conservative HQ and Tories elsewhere need to link up with the rest of us to raise our collective status.
We all need to reach out professionally to those who have commented, those who have shown anger and interest in this disaster, and raise our collective voices and shout 'NO MORE!' We need to get together and elbow the Glyn Evanses of this world out of the way, because we are citizens as well as webbies and as such we know we can do better than this crap served up to us as 'transformation'.
This episode should be used as an object lesson in how-not-to-do-it, and it should mark a turning point for web professionals in local government.
Make it so. Yes we can. You have the power.
How to write a good tweet
My guru Jakob Nielsen has published the results of research Nielsen Norman Group has just completed on tweet usability. To my knowledge, no one else has done this sort of research, although many of the recommendations have been picked up elsewhere and some are plain common sense (not that that often stops people ignoring sense ... !).
He walks us through their design of a promotional tweet, going through five iterations to end up with:
LAS VEGAS (October) and BERLIN (November): venues for our biggest usability conference ever http://bit.ly/UsabilityWeek
Here are some points from this tweet's development (with some embellishment from me - and a toy checker sketch after the list):
- Capitalising city names draws the eye, breaking up the quick scan
- Because when people scan they typically only read the first few words of a sentence, those first words need to be information-rich
- Promotional tweets can be ignored, so include some sense of news/newness to make them useful / less obviously promotional / more compelling
- Tweets should be 130 characters or fewer, to leave room for retweeting
- Full sentences aren't necessary in short content, which users are scanning, so ruthlessly chop unnecessary words and use quickly comprehensible characters like '+' and ':'
- Use a meaningful URL - which may appear elsewhere alone and out-of-context
- A tweet should be highly focused and not try to make multiple points
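Most of these checks are mechanical enough to automate. Purely as an illustration - my own toy sketch, with thresholds of my choosing, not anything Nielsen publishes - here is a checker for a few of the rules above:

```python
# Toy checker for a few of the tweet guidelines above. The 130-character limit
# comes from the list; the other thresholds and phrases are illustrative only.

import re

def check_tweet(text: str) -> list[str]:
    problems = []
    if len(text) > 130:
        problems.append(f"{len(text)} chars - leave room for retweeting (aim for 130 or fewer)")
    if len(re.findall(r"https?://\S+", text)) > 1:
        problems.append("more than one link - keep the tweet to a single point")
    # Scanners read only the first few words, so flag throat-clearing openers.
    if text.lower().startswith(("we are pleased", "we're pleased", "just a reminder", "there is")):
        problems.append("front-load the information-rich words")
    return problems

tweet = ("LAS VEGAS (October) and BERLIN (November): venues for our biggest "
         "usability conference ever http://bit.ly/UsabilityWeek")
print(check_tweet(tweet) or "passes these rough checks")
```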
However, for those other than Nielsen it may be more useful to use a service like Twittertise (which allows you to schedule tweets) to overcome Twitter's ephemeral, stream-driven nature (Nielsen says that, with click-through decay, Twitter time passes ten times faster than email time) and hit more of your likely audience amongst your followers. Obviously don't over-egg this!
Finally he reiterates a point which, in my experience, really needs driving home, as many people don't get it:
Text is a UI
and this applies doubly when you are talking about short text and when you're calling people to action.
Africa is bloo*y enormous
This is a stunning graphic
HT: Owen Barder
Tuesday, September 8
Repost: Fear the Google, don't fear the Google
I came back from a break (which included a fab visit to Wardman Towers) to the missed news that Google has settled out of court in its clash with US Publishers over its Google Books project.
Looking over last Friday's news coverage, what immediately struck me - as someone with an interest in the story - was the repeated suggestion that Google is the only organisation engaged in digitising books. Ben Cohen's Channel Four report even featured him at the British Library, which is part of a competing book digitisation project!
I wrote about Google books on my blog in another context back in April and, reviewing the content, it seems appropriate to repost.
~~~~~~~~
There's been a lot of paranoia around about Google lately. We've had Street View arriving in the UK and provoking posh protests blocking their camera car. Now we've got requests for investigations into the information they hold about their users.
The former seems bizarre to me in the country which has accepted more cameras looking down on us than anywhere else in the world. The latter appears worrying when it's suggested that they might disclose that information to states, but states can get far more from internet service providers (and already trawl our digital use anyway).
What's really exercised some though has been the Google business proposition, which is to use that data to better target ads.
Here's why I find this a bit ridiculous: Google just aren't very good at it.
I have been using Gmail for ages and send and receive heaps of email. Beside each one are text ads targeted at me using all the text in all that email plus the content of the particular email I'm looking at.
They're consistently mistargeted.
I swear I don't know why they think I might want Lionel Richie tickets, or have a legal problem.
After hunting I found one which is vaguely near to the content of the email. But only vaguely.
The area where we should be most concerned is the one which I've yet to see the British media really pick up on. And it actually concerns Google's mission: to organize the world's information and make it universally accessible and useful.
Google Books is their project to make available digital copies of out-of-copyright books and make copyright book text searchable.
They've signed up Oxford University amongst other big name partners.
Trouble is, there are several rivals to Google and they're open-source, not proprietary: services like PublicDomainReprints.org and the Internet Archive.
Recently Google changed its terms to specifically disallow any of these services from using books it had digitised - public domain books. There's not been any legal action thus far, but why change the terms if they didn't want to challenge others, like the Internet Archive, which hosts over half a million public domain books downloaded from Google?
Google has also 'locked up' some public domain books.
Here's an example of a public domain book on Google that was once 'Full access' and is now 'Snippet only': The American Historical Review, 1920. For the time being, there is a copy on Internet Archive.
The agreements with libraries (mainly university libraries), which were only made public by legal action, mean that the libraries give Google all of their books for free, and in return they are given scans that they effectively cannot use for anything.
If they want access to the corpus, they have to subscribe just like everyone else. This means that Google is requiring them to buy back their own copyrighted books, if anyone wants to actually use them on or off the campus.
Their recent deal with publishers which includes the setting up of a Books Rights Registry appears to give Google different, more favourable terms to anyone else who enters into agreements with the Registry.
The Open Content Alliance (OCA) is a consortium with the Internet Archive at its centre which wants to build a (virtual) Alexandria Library II (a physical Bibliotheca Alexandrina already exists). The OCA includes the British Library, the Royal Botanic Gardens at Kew and a number of corporations - though neither Google nor Microsoft, which recently left it, after funding the scanning of 750,000 books, to launch its own book-scanning project.
Brewster Kahle, who founded the Internet Archive and heads the Open Content Alliance, warns of "the consequences of the consolidation of information into the hands of a few private organizations".
Google is digitizing some great libraries. But their contracts (which were actually secret contracts with libraries – which is bizarre, but anyway, they were secret until they got sued out of them by some governments) are under such restrictions that they’re pretty useless... the copies that go back to the libraries. Pretty much Google is trying to set themselves up as the only place to get to these materials; the only library; the only access. The idea of having only one company control the library of human knowledge is a nightmare. I mean this is 1984 – a book about how bad the world would be if this really came about, if a few governments’ control and corporations’ control on information goes too far.
There are other issues here too with Google's relationship with libraries:
The OCA is trying to establish a standard, and both Google and now Microsoft have opted out. Not only is there duplication (triplication) of these vital efforts for human knowledge, but Google also refuses even to talk to them; it sees them as a rival.
Some may have second thoughts if Google’s system isn’t set up to recognize some of their digital copies, said Gregory Crane, a Tufts University professor who is currently studying the difficulty of accessing some digital content.
For instance, Tufts worries Google’s optical reader won’t recognize some books written in classical Greek. If the same problem were to crop up with a digital book in the Open Content Alliance, Crane thinks it will be more easily addressed because the group is allowing outside access to the material.
The OCA are building a "permanent, publicly accessible archive" of digitized texts. Both Google and Microsoft are doing it to make money - not that there's anything wrong with that, but it is right to be wary when such knowledge is only available via corporate, proprietary means.
Postscript: Thanks to Stefan Czerniawski for pointing me to this excellent piece by Doc Searls on the Google settlement.
I hope some journos are reading this because, thus far, the UK media's coverage of the Google settlement has been dire. And we're talking about an enormous, extremely significant issue here!
What's the connection between asylum/migration and LGBT people?
Left Outside has a great post (following one by Carl Packman) which dissects the age-old anti-migration arguments:
Our modern debate on migration has not developed out of a vacuum. In fact, we are forced to watch tedious reruns of discussions concerning Huguenots in the 1680s, Irish migrants in the early 19th Century and Eastern Europeans in the late, Jews in the 1930s, and West Indians and South Asians in the 1960s and 70s.
He says that any anti-migration media piece will always contain one of the following: The Disloyal Immigrant; Soft Touch Britain™; Diseased and sex-obsessed migrants; Criminal immigrants; Lump of Labour/Housing/Hospitals/Women Fallacy; or Swamped.
He uses these memes to play 'Immigrant Bingo'.
There's actually another thread of migration stories in papers like The Daily Hate (much more often in local newspapers), and that's the pro-migration ones.
Stories about someone in Shetland being defended by the locals, for example.
You don't see them very often but they do exist. These are the 'genuine asylum seekers', say these stories, and when talked about, such cases usually elicit majority sympathy from, yes, even Daily Hate (etc.) readers. I saw this with the 19-year-old gay Iranian Mehdi Kazemi: even in comments on The Sun's website, most people wanted him granted asylum.
What strikes me, as someone who works to defend LGBT asylum seekers, is the parallels with attitudes to LGBT people.
It is well established that once people have an LGBT person in the family, or have an LGBT friend or work colleague, they are far less likely to support anti-gay laws or to be prejudiced. I'd suggest this holds true for migrants and asylum seekers too: if you know one, you are less likely to want them deported. If you don't know one, migrants and asylum seekers are just numbers, a homogeneous mass.
This is why groups like Amnesty, UNHCR and the Refugee Council run campaigns which aim to put a human face on those who are defamed in the media on a daily basis. These, along with those occasional news stories, show that it is possible to get people to show some basic humanity.