Distracting ourselves to attention

17 March 2009

The origins of this post lie in a question I was asking myself: why is Twitter, a ludicrously low-bandwidth service, popular now and not in 2000 or 2004? At the same time I was reading a lot of posts about how Web 2.0 was going to turn our brains to goo. This post attempts to tackle both topics.

It seems you can’t move right now for excitement about Twitter and other forms of microcontent creation. Everything else is following Twitter’s “go small, go often” philosophy as well – just look at Facebook’s new lifestream interface. And with Twitter dominating current web thinking come fears of digital overload, warnings of destroyed attention spans (as Maggie Jackson argues in Wired), and even ludicrous claims that it will make you ill. On the other hand, this “snack culture” could be handled by our brains just fine, ultimately making us mentally fitter.

One problem: these are the wrong questions to be asking. The dichotomy of “brief and shallow” media vs. “long and deep” media is a false one. As Steven Berlin Johnson argued last year:

If we’re truly living in a snack culture, how come so many forms of entertainment – TV shows, games, movies – are getting longer? Most of us, I suspect, have had this experience lately: You tell a friend that they simply have to start watching one of the new long-format dramas, like Heroes or The Wire.

Right now, the movie everybody is going on about on the web is Watchmen, which clocks in at an impressive 2 hours 42 minutes, and is itself an adaptation (for better or worse) of one of the most complex and layered graphic novels of the 20th century, the only graphic novel to make TIME’s All-Time 100 Novels list, and one held in almost impossibly high esteem by the exact same demographic (geeks) that also loves to be on Twitter and other Web 2.0 sites.

Watchmen (and the other examples Johnson cites and expounds upon in Everything Bad Is Good for You) shows that when consuming media, depth and brevity are not irreconcilable; you can concentrate on something difficult and concrete as well as enjoy content 140 characters at a time. And yet Twitter often gets demonised as the poster boy for the inanity of Web 2.0. Perhaps that’s no surprise, with its chief characteristics of brevity and ephemerality being the exact opposite of how we have consumed media in the past. Given that the “value” of old media was often measured by its length (writers being paid by the word) or durability (all those books and records on your shelves), what’s produced in new media is often characterised as comparatively worthless, particularly by those who cut their teeth in the old media.

It’s tempting to say this was inevitable: the Internet coming in and destroying business models, rational thought, the very fabric of the universe as we know it. Yet Twitter didn’t become popular just because of all that free bandwidth, server power and cute pictures of whales. In fact, we were technically capable of producing Twitter five years ago, with broadband adoption relatively high among early adopters (geeks), and Twitter’s paltry bandwidth needs easily handled by WAP or SMS on mobile. It took off now for two reasons: firstly, because people wanted to take part in this communal self-expression, the so-called “adrenaline shot” of information; and secondly, because we are now able to. We are not rats being stimulated by electrodes at a lab assistant’s will – Twitter is popular not just because we wanted to take part, but because we became capable of taking part.

What if the vast rise in information volume and variety was not an inevitable result of the digital landscape, but rather a consequence of how we behaved socially and digitally? What if, as the “early adopters” of the mid-2000s – the bloggers and wiki editors and geeks – got more used to dealing with information (continually writing, creating, photographing, rating, tagging, chatting, texting, poking and so on) and improved their brevity and clarity, they became more efficient at producing information? And once they became more efficient, they sought (or created) channels and media that better exploited that efficiency.

I’ll lead from my own personal experience here. Looking over this blog now and comparing it with 2004, you’ll see I write a lot less than I used to. What thought would have taken me an entire blog post back then, I can now Tweet once, or post to Delicious with a little extended commentary. How? Because I got better at communicating more briefly, and what used to take a blog post now can take a linklog post or a Tweet.

This can even improve the quality of one’s writing. Once you’re confined to 140 characters (often fewer, if there are hashtags or a username to squeeze in too), and in a world where a lot of people are Tweeting, you have to make what you say more interesting, not less, to stand out from the crowd. Extraneous words, useless adjectives and sucking up have to get the boot if you want to fit what you’re saying in succinctly. It’s the exact opposite of growing sentences with David Foster Wallace, and while that may send fans of dense prose into despair, it does encourage more simplicity and less pretension.
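To give a sense of how tight that budget really is, here’s a trivial back-of-the-envelope calculation. The 140-character limit is real; the handle and hashtags are made up purely for illustration.

```python
# Back-of-the-envelope character budget for a 140-character Tweet.
# The limit is real; the example handle and hashtags are hypothetical.

LIMIT = 140

reply_to = "@someusername "        # 14 characters, including the trailing space
hashtags = " #web20 #attention"    # 18 characters, including the leading space

budget = LIMIT - len(reply_to) - len(hashtags)
print("Characters left for the actual thought:", budget)  # 108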

Of course, not everything on Twitter is gold – let’s face it, a lot of it is shit – but then again, a lot of everything is shit. Those of us who are fans of Twitter are comfortable with the level of shit and have filtering mechanisms that can deal with it, so that we can find the stuff we really do enjoy. This comes back to what Clay Shirky called “filter failure”: the idea that “information overload” is not really the problem – there has been more information than anyone could consume ever since the invention of the printing press – but that our filters have been radically realigned and are now the responsibility of consumers rather than publishers.
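To make the consumer-side-filter idea a little more concrete, here is a minimal, purely illustrative sketch of the sort of crude filter a reader might run over their own stream. The posts, names and scoring rules are all invented; this is not how Twitter or any real client works.

```python
# A toy, consumer-side "filter" over a stream of posts: an illustration of
# pushing the filtering burden from publisher to reader. Everything here
# (authors, keywords, scoring) is made up for the sake of the example.

POSTS = [
    {"author": "alice", "text": "New post on filter failure and attention"},
    {"author": "bob",   "text": "what i had for lunch lol"},
    {"author": "carol", "text": "Long-form drama vs. snack culture, a thread"},
]

FOLLOWED = {"alice", "carol"}                   # people whose posts get a boost
KEYWORDS = {"attention", "filter", "culture"}   # topics this reader cares about


def score(post):
    """Crude relevance score: followed authors and keyword hits both count."""
    s = 0
    if post["author"] in FOLLOWED:
        s += 2
    words = {w.strip(".,").lower() for w in post["text"].split()}
    s += len(words & KEYWORDS)
    return s


def my_filter(posts, threshold=2):
    """Keep only posts scoring at or above the reader's own threshold."""
    return [p for p in posts if score(p) >= threshold]


if __name__ == "__main__":
    for post in my_filter(POSTS):
        print(post["author"] + ": " + post["text"])
```

The point of the sketch is simply that the threshold and the keywords belong to the reader, not to an editor or publisher; tuning them is the “training up your filters” described below.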

The key is this – early adopters have not become distracted to death by consuming services such as microblogging, because we only moved onto them once we’d become publishers ourselves, mastering the production of information online through other forms of media. By immersing ourselves in those information flows and working out what works and what doesn’t, we became better consumers of information as well – although that took a little time, which is why Twitter lagged several years behind the rise in blogging.

Discussion here is dominated by Twitter, but for “Twitter” read anything new, distracting and disturbing in Web 2.0 that is emerging or will emerge in the near future; the same principle should hold – we choose to adopt these things because we are able to, as well as because we’re willing. This can appear puzzling to the outsider, and the only way to join in (if you want to join in) is to do what the geeks did: produce and consume more information to help train up one’s filters. As I have discussed before, we will still need some technological help in the filtering process, but no technology can relieve the brain of all of its filtering burden. It’s somewhat ironic that the best way of preventing information overload is to indulge in even more of it – but who said anything in this world was that straightforward?


8 Responses

Chris – you nailed it. This is absolutely spot on. Bravo! : ) (I would previously have written a gushing review on several blogs, but I think I’ll Tweet about it instead. This, as you say, is just *better* communication… and my brain won’t give me cancer in the process)

This harks back to Odlyzko’s “Content is not King”. WAP was supposed to be the killer app and SMS was an afterthought. Which one do people use most?

Twitter is the SMS of Web 2.0. It’s clear people really want to talk and listen to each other more than strategists seem to understand. Makes me laugh.

All good points. I do still wonder about the fragmentation of attention.
I know my typical day is now splintered into sub-minute shards and it wasn’t always this way.
Can’t help but think all these new instant media are part of the reason.

I also wonder if the reduction to 140 characters doesn’t create the illusion of community and communication, replacing the real thing. Let’s face it, my ‘followers’ aren’t really following me at all. Nine out of ten times, I’m tweeting into the void.

I don’t doubt that something real is going on here and something valuable being born. But I can’t help but feel there’s a price. Just haven’t figured out what it is and how costly.

“I’ll lead from my own personal experience here. Looking over this blog now and comparing it with 2004, you’ll see I write a lot less than I used to. What thought would have taken me an entire blog post back then, I can now Tweet once, or post to Delicious with a little extended commentary. How? Because I got better at communicating more briefly, and what used to take a blog post now can take a linklog post or a Tweet.”

Is it really possible to compose a Tweet which conveys as much information and opinion, as this whole blog posting does?

Is it really possible to compose a Tweet which conveys as much information and opinion, as this whole blog posting does?

No – and I wasn’t advocating abandoning all blogging in favour of just Tweeting or linklogging (my recent absence from blogging notwithstanding) – just that a lot of the content I used to push onto the blog is now better suited to those other media. Blog posts still have a very important role to play for longer and more drawn-out thinking, of course.

Great post. You’ve managed to condense all the points and issues raised. Indeed, a lot of everything is shit – just look at TV and magazines. Communication comes in many shapes and sizes. Just because I read Twitter doesn’t mean I don’t read books, magazines or newspapers (oh hang on, I don’t read newspapers, but that’s another conversation).

This has themes in common with something Alex Harrowell said ages ago concerning the creation of a more hostile memetic environment.

In order to become smarter as a civilization we need to be more open to ideas and have more of them but at the same time be much more critical and sceptical of all the ideas we are exposed to.

IOW we need to do as you suggest and become more practiced at filtering out the nonsense and finding the gold.

Great post.

[...] has a thoughtful piece on the reasons Twitter has taken off a few years after the rise of blogging, and why [...]