Sorry… there will likely not be much traffic in either the main blog or the linklog for the next few days… moving house, temporary lack of broadband, work, G8… gimme a few days for normal service to resume…
A Monday lunchtime free-thinking, stream of consciousness sort of blog post follows…
Eugh. Moving house is crap. Official. Especially dealing with settling all manner of bills, involving Herculean battles against the “Computer says no…” culture inherent in many of our utility companies (“Can you tell me my final balance please?” “Nope, sorry” “Not even an estimate?” “No, we don’t do that” “But…but…you’re the accounts and billing department!” “Sorry…it will take several days for the system to process” “What the fuck are you running your computers on, steam? Monkeys with abacuses? I’ve just given you the meter readings, I could work out my gas bill right now with pen and paper, you hapless moron” – alright, I didn’t actually say that last bit, but I did think it).
“Computer says no” is possibly my favourite set of Little Britain sketches (aside – I don’t know what it is, but I don’t like Little Britain as much as I did the Fast Show, which is essentially the same setup of a series of recurring characters with absurd mannerisms and familiar catchphrases – the Fast Show was somehow funnier and more endearing), possibly because it is the most true, at least in the social consciousness. But in actual fact it’s completely false; computers don’t ‘say’ anything, they merely reflect what we’ve told them to do. They obey the code we write (bugs in the compiler notwithstanding). The code we write reflects the way we approach problems; thus code, as well as being a technology, is a literary device for the free expression of our thoughts. There are, within the vast family of programming languages, many different literary styles and ideas of what is ‘good’ and what is ‘bad’ code. Essentially, code is literature.
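To labour the point with a toy sketch (every name, number and tariff here is invented for illustration, not how any utility company actually works): the computer could answer the gas-bill question instantly; “computer says no” is just a rule a human wrote.

```python
def final_balance(last_reading: float, previous_reading: float,
                  unit_price: float, standing_charge: float) -> float:
    """Work out a gas bill the way anyone with pen and paper would."""
    units_used = last_reading - previous_reading
    return units_used * unit_price + standing_charge

# The machine can do the sum in microseconds...
bill = final_balance(last_reading=1523.0, previous_reading=1475.0,
                     unit_price=0.03, standing_charge=5.0)

# ...unless a human decides it shouldn't. This is a policy choice,
# written by a person, not a law of nature:
COMPUTER_SAYS_NO = True

def customer_service_reply() -> str:
    if COMPUTER_SAYS_NO:
        return "Sorry, it will take several days for the system to process"
    return f"Your final balance is £{bill:.2f}"
```

The point being: when the call centre says the system can’t do it, what they mean is that somebody wrote an `if` statement.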
So, although Richard Stallman’s denunciation of software patents in last week’s Guardian, with its comparisons to “what if 19th Century literature was subject to the same rules?”, may seem tenuous to begin with, and Stallman could maybe have done a better job of explaining it, the essential point he makes is quite valid. Of course, many people won’t appreciate the link, and will possibly deride it as an intrusion of technology into art, for the same reasons why “computer says no” is often taken as an unanswerable mantra: the idea that technologies are not at all human, that they have their own separate realm which they rule, and that whenever ordinary people intrude, we become subject to it. This is wrong, and carries the implication that we should try to maintain this false division to preserve ourselves; it is a view promoted both by some technologists, to further cement their position as high priests, the few people in command of the technology, and by neo-Luddites and art snobs, who both deny that technology should be regarded as a human artistic process, like so many other activities.
The trick is to expand the definition of “we” in “they obey the code we write” – at the moment “we” all too often applies only to those with the ability to write the code directly. The Open Source/Free Software movement is a good initial stab in the right direction; although most of its projects still involve an élite of programmers, the central philosophy that code is there for everyone to change and shape is its most uplifting and optimistic feature, and sets the movement’s principal actors well apart from the tech high priests. The next step is not just getting ‘ordinary’ users to download and use the products of OSS such as Linux or Firefox, but getting them to realise that they can participate as well. That doesn’t just mean creating new tools, and possibly new languages and concepts that don’t require lofty expertise, to broaden participation on the technological side, but also breaking down the false wall between “people” and “computers” on the social side too. In short, we need people to start saying “yes”, rather than letting computers say “no”.
This looks a bit wank, and my reasoning has holes in it, but I’ll publish it anyway. How the hell did I get from my gas bill to the philosophy of technology via Little Britain and Richard Stallman? This could perhaps make the ‘most tenuous linking of subjects in blog post’ award of the year, if such a thing exists…
While I’m stuck in a library all day writing out more notes than is either humanly possible or morally right, I have noted with great disappointment how my IM contacts list has dropped from the usual 10–20 people to below half a dozen; most likely because loads of them have gone to enjoy Glastonbury, the fuckers. Therefore I can’t feel anything but a wry bit of joy when the BBC pops up headlines like Flooding causes Glastonbury chaos. Hahaha. Does this make me a bad person?
Update: The ‘glastonbury’ tag on Flickr, with lots of mobile phone shots, is a nice semi-“live” feed of the latest conditions…
Further update: Tom and I have been thoroughly enjoying the, er, photostream. This photo, of the stranded Portaloos while a festival-goer really starts to regret that last cider, is my current favourite.
The story of a Welsh AM demonstrating a crimebusting drugs-testing machine in a publicity stunt, then promptly testing positive for cannabis herself (via fridgemagnet), is at first amusing. But it becomes quite worrying when the explanation is not that she’s been enjoying the green, but that it came from cross-contamination, such as: “touching cash or a door handle previously handled by a drugs user.” Which immediately raises the question – if it’s so prone to false positives, why the hell are we letting the police use it? At least it cannot be used in court, but it can still be used by the police to track & target innocent people on the flimsiest of supposed evidence.
G.K. Chesterton once said, on the use of capitalism to cure social ills, that:
“It was the mystical dogma of Bentham and Adam Smith and the rest, that some of the worst of human passions would turn out to be all for the best. It was the mysterious doctrine that selfishness would do the work of unselfishness.”
I was mindful of those comments when, via Nick Cohen in the Observer, I browsed the website of a new book, Rip-Off!, a long-needed exposé of the wanton avarice within the management consultancy industry from an insider, with particular reference to the ‘reform’ of public services; the case of an NHS trust having an American consultant foisted upon it, and being expected to cover not just his fees but the costs of moving him across the Atlantic, and the expense of housing his entire family and schooling his children, is shocking, especially as he was no more special than any British consultant who could have been hired.
For too long the consultancy industry has been able to charge exorbitant fees, with little public coverage, for the same advice (which most of the time can be boiled down to “cut costs, fire people”) the client would have got if they’d kept the operation in-house. With a few honourable exceptions such as Private Eye and the odd Guardian investigation into PFI health and education projects – both a result of New Labour’s thrall to the consultant’s spiel – there is little scrutiny of such murky dealing. Given that it costs the taxpayer £1.9bn a year to pay for consultants’ advice, this is pretty shameful. This book is a welcome arrival. The first chapter is available free as a PDF, and is highly recommended reading; my order for the rest of the book is in the post…
About 20 years ago a new kind of bike started appearing on British streets: the mountain bike. Where did it come from? Not from a lone inventor working in his shed, experimenting feverishly. Not from the research and development lab of a mainstream bike manufacturer. The mountain bike came from users, especially a group of young enthusiasts in California who were frustrated that they could not ride along mountain trails on racing bikes.
There’s all kinds of idiocy in that first paragraph, from a simplistic black-and-white distinction between “users” and “inventors”, to a further promotion of the myth of the lone inventor – the idea that there has ever been a time when all creative activity was done by lone individuals acting ex nihilo (a myth which followed, and was shaped by, the similarly flawed concept of the single Romantic author, the solitary original creative genius – but that’s a story for another day). There’s also an implicit myth that bicycles in general were, pre-mountain bike era, unchanged and static, when in fact they have been evolving, shaped not just by the manufacturers but by the users, from the day they first appeared on the market (I have rambled on about this in the past).
Then, to make things worse, the article flat-out contradicts itself:
The first commercial mountain bike came out in 1982, and the big bike manufacturers piled in. By the mid-Eighties, 15 years after the users had developed the first mountain bike, it was a staple of the mainstream market. In 2000, mountain bikes accounted for 65 per cent of bike sales in the US.
Before going on to assert:
We are moving from an era of mass production to one of mass innovation.
Hang on – if we’re leaving the era of mass production, where have all those bikes come from?
Anyway, the “total bollocks” alarms really start to go off when he starts talking about eBay:
In 1995 only about 122 people were trading on the forerunner of EBay. Now there are 122 million. EBay’s growth is in large part due to putting easy-to-use tools in the hands of users and letting them trade together. EBay charges for providing a platform and the tools. Users are free to do much as they like with them. EBay, as a firm, is sustained by its users who provide much of the innovation.
Now, while it’s true that eBay does offer a wide variety of tools, they’re not exactly that flexible; they’re closed-source and designed to constrain users from doing anything too dangerous (which is understandable; given the sensitive information eBay holds, giving users free rein to play with its inner workings might be a bad idea). And while eBay offers an API for some real ‘power use’, it’s still under their terms, and very few regular eBay users are going to have the time or ability to exploit it.
The article then tries to unify both the rise of Linux and that of internet-coordinated campaigns like Jubilee 2000 into one big theory of where the world is going, shifting towards a paradigm of user- and individual-oriented innovation. But all four examples are fundamentally different; a physical technology, a closed digital technology-cum-marketplace, an open digital technology and a political campaign each have their own unique sets of politics and economics. Grand over-reaching theories of everything rarely satisfy, whether in the natural sciences or the social sciences.
The thing is, this is a real shame, because the middle bit of the article is spot-on, at least as far as technology is concerned:
We will need to rethink deep-seated notions. We like to think innovation comes in a flash of genius and insight – a eureka moment – to an individual who is the author of the idea. Our patent system is based on the idea that the individual inventor can say in advance what their invention is for. […] We have come to think that all creativity resides in the special people and places, the home of the creative class: the designer in his studio, the boffin in the lab, the geek in the garage, the bohemians wandering the cultural quarters of our leading cities.
But the mistake he makes is to think this is a recent phenomenon; historical social studies of technology show this to be far from the case – even in the mass-produced, industrialised age, users (well, some users) have always tinkered, modified and amalgamated technologies into new configurations (there’s some good work on this by Williams and Fleck, amongst others). There is the question of whether a greater proportion of users are now doing so, because of better education, more information available to us, more free time, etc. (I believe this to be the case), but that does not mean it did not exist in the past; there has always been a human instinct to create, and creation nearly always involves reusing the established work of others like this.
But there are obstacles to this creation; some domains are by their very nature more flexible and pliable than others – while it is easy for people to customise their copy of The Sims, this cannot be extended to the treatment of diabetes (this is actually his example). Innovation in medicine is far riskier and carries a markedly different set of values to innovation in gaming. Issues of expertise, cost, measuring outcomes, power and ethics all muddy the water.
A departure from the flawed paradigm of the lone genius, the solitary inventor, towards the idea of a remix culture (and the enormous value of a commons of knowledge for creators to use) and of innovation as a shared group activity is a good thing, and deserves more treatment. But to gain any true understanding from it, we need to apply not just the ideas of the future, but also the lessons of the past.
As a footnote, the article also defends Hilary Cottam, who won Designer of the Year for the design of a school, a project she wasn’t directly involved with; thus she’s been criticised for taking someone else’s credit. The defence, that she may not have done the actual design but deserves the award for encouraging the users of the school to participate in designing it, misses the point; if we are going to recognise the collaborative and shared nature of innovation, then we should just get rid of awards like Designer of the Year.