This post began with a question I was asking myself: why is Twitter, a ludicrously low-bandwidth service, popular now and not in 2000 or 2004? At the same time, I was reading a lot of posts about how Web 2.0 was going to turn our brains to goo. This post attempts to tackle both topics.
It seems you can’t move right now for excitement about Twitter and other forms of microcontent creation. Everything else is following Twitter’s “go small, go often” philosophy as well – just look at Facebook’s new lifestream interface. And with Twitter dominating current web thinking come fears of digital overload, warnings of destroyed attention spans (as Maggie Jackson argues in Wired), and even ludicrous claims that it will make you ill. On the other hand, our brains might handle this “snack culture” just fine, ultimately making us mentally fitter.
If we’re truly living in a snack culture, why are so many forms of entertainment – TV shows, games, movies – getting longer? Most of us, I suspect, have had this experience lately: you tell a friend that they simply have to start watching one of the new long-format dramas, like Heroes or The Wire.
Right now, the movie everybody on the web is talking about is Watchmen, which clocks in at an impressive 2 hours 42 minutes, and is itself an adaptation (for better or worse) of one of the most complex and layered graphic novels of the 20th century – the only such book to make TIME’s list of the 100 best novels of all time – one held in almost impossibly high esteem by the exact same demographic (geeks) that also loves Twitter and other Web 2.0 sites.
Watchmen (and the other examples Johnson cites and expounds upon in Everything Bad Is Good For You) shows that when consuming media, depth and brevity are not totally irreconcilable; you can concentrate on something difficult and concrete as well as enjoy content 140 characters at a time. And yet Twitter often gets demonised as the poster boy for the inanity of Web 2.0. Perhaps that’s no surprise, given its chief characteristics of brevity and ephemerality – the exact opposite of how we have consumed media in the past. Given that the “value” of old media was often measured by its length (writers being paid by the word) or durability (all those books and records on your shelves), what’s produced in new media is often characterised as comparatively worthless, particularly by those who cut their teeth in the old media.
It’s tempting to say this was inevitable: the Internet coming in and destroying business models, rational thought, the very fabric of the universe as we know it. Yet Twitter didn’t just become popular because of all that free bandwidth, server power and cute pictures of whales. In fact, we were technically capable of producing Twitter five years ago, with broadband adoption relatively high among early adopters (geeks), and Twitter’s paltry bandwidth needs easily handled by WAP or SMS on mobile. It arose for two reasons: firstly, because people wanted to take part in this communal self-expression – giving us the so-called “adrenaline shot” of information. But we are not rats being stimulated by electrodes at a lab assistant’s will, and so, secondly, Twitter is popular not just because we wanted to take part, but because we are now able to as well.
What if the vast rise in information volume and variety was not an inevitable result of the digital landscape, but rather a consequence of how we behaved socially and digitally? What if, as the “early adopters” – the bloggers and wiki editors and geeks – of the mid-2000s got more used to dealing with information – continually writing, creating, photographing, rating, tagging, chatting, texting, poking and so on – and improved their brevity and clarity, they became more efficient at producing information? And once they became more efficient, they sought (or created) channels and media that better exploited that efficiency.
I’ll lead with my own personal experience here. Compare this blog now with its 2004 archives and you’ll see I write a lot less than I used to. A thought that would have taken me an entire blog post back then can now fit in a single Tweet, or a Delicious post with a little extended commentary. How? Because I got better at communicating briefly, so what used to take a blog post now takes only a linklog entry or a Tweet.
This constraint can even improve the quality of one’s writing. Once you’re confined to 140 characters (often fewer, if there are hashtags or a username to fit in too), and in a world where a lot of people are Tweeting, you have to make what you say more interesting, not less, to stand out from the crowd. Extraneous words, useless adjectives and sucking up have to get the boot if you want to say your piece succinctly. This is the exact opposite of the ever-growing sentences of David Foster Wallace, and while it may send fans of dense prose into despair, it does at least encourage more simplicity and less pretension.
Of course, not everything on Twitter is gold – let’s face it, a lot of it is shit – but then again, a lot of everything is shit. Those of us who are fans of Twitter are comfortable with the levels of shit and have filtering mechanisms that can deal with them, so that we can find the stuff we really do enjoy. This comes back to what Clay Shirky called “filter failure”: the idea that “information overload” is not the problem – it has been with us since the invention of the printing press – but rather that our filters have been radically realigned, and are now the responsibility of consumers rather than publishers.
The key is this: early adopters have not been distracted to death by consuming services such as microblogging, because we only moved onto them once we had become publishers ourselves, mastering the production of information online through other forms of media. By doing so – immersing ourselves in those information flows and working out what works and what doesn’t – we became better consumers of information as well. That took a little time, which is why Twitter lagged several years behind the rise of blogging.
The discussion here has been dominated by Twitter, but for “Twitter” read anything new, distracting and disturbing in Web 2.0 that is emerging or will emerge in the near future; the same principle should hold – we choose to adopt these things because we are able to as well as because we are willing. This can appear puzzling to the outsider, and the only way to join in (if you want to join in) is to do what the geeks did: produce and consume more information, to help train up your filters. As I have discussed before, we will still need some technological help in the filtering process. But no technology can relieve the brain of its entire filtering burden. It’s somewhat ironic that the best way of preventing information overload is to indulge in even more of it – but who said anything in this world was that straightforward?