The Guardian have launched a Digital Literacy Campaign, led by an article entitled “Britain’s computer science courses failing to give workers digital skills”:
In higher education, although universities such as Bournemouth are praised by employers for working closely with industry, other universities and colleges have been criticised by businesses for running a significant number of “dead-end” courses in computer science, with poor prospects of employment for those enrolled.
(Feel free to call me out as a snob on this one; I read my Bachelor’s in Computer Science at Cambridge, one of the few universities in this country where the majority of the course is spent not coding.)
Anecdotal though my own experience and many of the quotes in the article are, the Guardian’s campaign is laudable, and I back teaching kids to code in schools. But there are two issues I have with the campaign – it’s not just teaching that’s needed, and not just code that needs to be taught (or learned).
Firstly, “digital literacy” is as broad a term as “literacy” or “numeracy”, and there are a range of different issues at stake. Take this complaint in the above article:
Ian Wright, the chief engineer for vehicle dynamics with the Mercedes AMG Petronas Formula One team, said: “There’s definitely a shortage of the right people. What we’ve found is that somebody spot on in terms of the maths can’t do the software; if they’re spot on in terms of the software, they can’t do the maths.”
Kim Blake, the events and education co-ordinator for Blitz Games Studios, said: “We do really struggle to recruit in some areas; the problem is often not the number of people applying, which can be quite high, but the quality of their work. We accept that it might take a while to find a really good Android programmer or motion graphics artist, as these are specialist roles which have emerged relatively recently – but this year it took us several months to recruit a front-end web developer. Surely those sorts of skills have been around for nearly a decade now?”
In a highly critical report last month, school inspectors warned that too many information and communication technology (ICT) teachers had limited knowledge of key skills such as computer programming. In half of all secondary schools, the level many school leavers reach in ICT is so low they would not be able to go on to advanced study, Ofsted said.
Computer Science is not Programming, Programming is not Web Development, and Web Development is not ICT. What we have is a whole spectrum of different demands and different roles, all of which have technology in common but often little else. Producing computer models for a Formula One team or a CGI studio demands a PhD-level (or near) grasp of maths or physics, combined with knowledge of highly specialised programming. Developing the front-end of a website still demands a reasonable degree of intelligence, but also a wider knowledge of languages and coding, and a better appreciation of more subjective issues such as usability, browser standards (or the lack of them) and aesthetics. Meanwhile, being adept with ICT doesn’t require being a genius or an expert in code, but it needs to be more than knowing how to make a PowerPoint presentation: how to use a computer properly and not just by rote, how to be confident in manipulating and understanding data, how to automate tedious tasks, and how to solve a problem creatively.
Today technology is integrated into our lives to a quite frankly frightening degree. Should that mean everyone has to learn how to code? No. Should it mean that everyone has an understanding of the basics, an appreciation of what computers can and can’t do, and the ability to use that knowledge to solve problems by themselves? Yes. But making everyone code is not the answer, and to me the Guardian is taking a bit of an “if it looks like a nail” approach to the problem of digital illiteracy.
That said, from my experience of the graduate CVs I read, the teaching of coding, as a practice, does need to improve. University courses should be better assessed and monitored and the “sausage factories” closed. Teaching how to code should be integrated into related subjects such as maths and physics wherever possible (and it’s worth noting many places do this well already). It shouldn’t just be coding that is taught, but how to define a problem, to break it down, and solve it. If anything, that’s more important – programming languages and technologies change all the time (e.g. how many Flash developers do you think will be about in five years’ time?) but the problems usually remain the same.
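To make that concrete, here is a toy sketch of the decomposition habit I mean – the problem and the function names are entirely my own invention, not anything from a curriculum or the Guardian’s resources. The task is “find the most common words in a text”; the point is breaking it into small, nameable steps before writing any clever code:

```python
from collections import Counter

def normalise(text):
    # Step 1: tame the messy input – lower-case everything
    # and turn punctuation into spaces.
    return "".join(c.lower() if c.isalnum() else " " for c in text)

def tally(words):
    # Step 2: count how often each word appears.
    return Counter(words)

def top_words(text, n=3):
    # Step 3: compose the small steps to answer the original question.
    return tally(normalise(text).split()).most_common(n)

print(top_words("The cat sat on the mat. The cat slept."))
```

The language is incidental; the same three-step breakdown works in any language, which is exactly why the decomposition is the durable skill and the syntax isn’t.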
Secondly, there’s a spectrum of challenges, but there’s also a spectrum of solutions. It’s not just schools and universities that need to bear the burden. As I said, coding is a practice; there’s only so much that can be taught, and an incredible amount of my knowledge comes from experience. Practical projects and exercises in school or university are essential, but none of that beats having to do it for real. Whether it’s for a living or in your spare time (coding your own site, or taking part in an open-source project), the moment your code is being used in the real world and real people are bitching about it or praising it, you get a better appreciation of what the task involves.
So it’s not just universities and schools that need to improve their teaching if we want to produce better coders. Employers should take a more open-minded approach to training staff to code – those who are keen and capable – even if it’s not part of their core competence. Technology providers should make it easier to code on their computers and operating systems out of the box. Geeks need to be more open-minded and accommodating towards interested beginners, and to build more approachable tools like Codecademy. Culturally, we need to treat coding less like some dark art or the preserve of a select few.
On that last point, the Guardian is to be applauded for barrier-breaking, for making the topic a little less mysterious and for engaging with it in a way I’ve seen precious little of from any other media outlet. And the page on how to teach code is a great start – it should really be called how to learn code, because it’s a collection of really useful resources. For what it’s worth, I wrote a blog post nearly three years ago on things to get started on – though if I wrote it today I would probably drop the tip on regular expressions (what was I thinking?).
If I had one last thing to add, it’s that all of the Guardian’s campaign, and the support from Government, is framed around coding for work. Which is important – we are in the economic doldrums and the UK cannot afford to fall behind other nations. But, at the same time, the first code a beginner writes is going to be crap, and not very useful. Even when they reach a moderately competent level, their code won’t be much use beyond the particular task it was built for. Making really good code that is reusable and resilient is bloody hard work, and it would be off-putting to make beginners judge themselves against that standard.
We need to talk a lot more about why we code as well as how we code. I don’t code for coding’s sake, or just because I can make a living out of it. I code because it’s fun solving problems, it’s fun making broken things work, and it’s fun creating new things. Taking the fun out of it, making it merely a “transferable skill” for economic advantage, will suck the joy out of coding just as management-speak sucks the joy out of writing. It doesn’t have to be like that. Emphasise the fun, emphasise the joy of making the infernal machine do something you didn’t think it was possible to do, encourage the “Isn’t that cool?” or “Doesn’t that make life easier?”. Get the fun bit right first, and the useful bit will follow right after.