Why it’s not just about teaching kids to code

The Guardian have launched a Digital Literacy Campaign, led by an article entitled “Britain’s computer science courses failing to give workers digital skills”:

In higher education, although universities such as Bournemouth are praised by employers for working closely with industry, other universities and colleges have been criticised by businesses for running a significant number of “dead-end” courses in computer science, with poor prospects of employment for those enrolled.

And from my own anecdotal experience, that’s correct. For one reason or another, I’ve been reviewing CVs and interviewing candidates for developer roles at work over the last couple of months, and some of them were awful. They tended to have degrees or other qualifications from mid- and lower-tier universities and colleges, but had trouble telling the difference between PHP and JavaScript code, or were unable to provide even stock answers to well-worn problems such as sorting.
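To be clear about the bar I’m setting with “stock answers”: I mean something as basic as a textbook sort, reproduced from memory. A minimal sketch in Python (my illustration, not any candidate’s actual code):

```python
def insertion_sort(items):
    """Textbook insertion sort: shift each element left until it's in place."""
    items = list(items)  # work on a copy, leave the caller's list alone
    for i in range(1, len(items)):
        current = items[i]
        j = i - 1
        while j >= 0 and items[j] > current:
            items[j + 1] = items[j]  # shift larger elements right
            j -= 1
        items[j + 1] = current
    return items

print(insertion_sort([5, 2, 4, 1, 3]))  # → [1, 2, 3, 4, 5]
```

Nothing clever – quadratic time, a dozen lines – but it’s the kind of answer I’d expect any computing graduate to be able to produce on a whiteboard.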

(Feel free to call me out as a snob on this one; I read my Bachelor’s in Computer Science at Cambridge, one of the few universities in this country where the majority of the course is spent not coding.)

Anecdotal though my own experience and many of the quotes in the article are, the Guardian’s campaign is laudable and I back teaching kids to code in schools. But I have two issues with the campaign – it’s not just teaching that matters, and it’s not just code that needs to be taught (or learned).

Firstly, “digital literacy” is as broad a term as “literacy” or “numeracy”, and there are a range of different issues at stake. Take this complaint in the above article:

Ian Wright, the chief engineer for vehicle dynamics with the Mercedes AMG Petronas Formula One team, said: “There’s definitely a shortage of the right people. What we’ve found is that somebody spot on in terms of the maths can’t do the software; if they’re spot on in terms of the software, they can’t do the maths.”


Kim Blake, the events and education co-ordinator for Blitz Games Studios, said: “We do really struggle to recruit in some areas; the problem is often not the number of people applying, which can be quite high, but the quality of their work. We accept that it might take a while to find a really good Android programmer or motion graphics artist, as these are specialist roles which have emerged relatively recently – but this year it took us several months to recruit a front-end web developer. Surely those sorts of skills have been around for nearly a decade now?”


In a highly critical report last month, school inspectors warned that too many information and communication technology (ICT) teachers had limited knowledge of key skills such as computer programming. In half of all secondary schools, the level many school leavers reach in ICT is so low they would not be able to go on to advanced study, Ofsted said.

Computer Science is not Programming, and Programming is not Web Development, and Web Development is not ICT. What we have is a whole spectrum of different demands and different roles, all of which have technology in common but often little else. Producing computer models for a Formula One team or a CGI studio demands a PhD-level (or near) grasp of maths or physics, combined with knowledge of highly specialised programming. Developing a front-end for a website still demands a reasonable degree of intelligence, but also a wider knowledge of languages and coding, and a better appreciation of more subjective issues such as usability, browser standards (or the lack of them) and aesthetics. Meanwhile, being adept with ICT doesn’t mean you have to be a genius or an expert in code, but it needs to be more than knowing how to make a PowerPoint presentation: how to use a computer properly and not just by rote, how to be confident in manipulating and understanding data, how to automate tedious tasks, how to creatively solve a problem.

Today technology is integrated into our lives to a quite frankly frightening degree. Should that mean everyone has to learn how to code? No. Should it mean everyone has an understanding of the basics, an appreciation of what computers can and can’t do, and the ability to use that knowledge to solve problems by themselves? Yes. But making everyone code is not the answer, and to me the Guardian is taking a bit of an “if it looks like a nail” approach to the problem of digital illiteracy.

That said, from my experience of the graduate CVs I read, the teaching of coding, as a practice, does need to improve. University courses should be better assessed and monitored and the “sausage factories” closed. Teaching how to code should be integrated into related subjects such as maths and physics wherever possible (and it’s worth noting many places do this well already). It shouldn’t just be coding that is taught, but how to define a problem, to break it down, and solve it. If anything, that’s more important – programming languages and technologies change all the time (e.g. how many Flash developers do you think will be about in five years’ time?) but the problems usually remain the same.

Secondly, there’s a spectrum of challenges, but there’s also a spectrum of solutions. It’s not just schools and universities that need to bear the burden. As I said, coding is a practice. There’s only so much that can be taught; an incredible amount of my knowledge comes from experience. Practical projects and exercises in school or university are essential, but from my experience, none of that can beat having to do it for real. Whether it’s for a living, or in your spare time (coding your own site, or taking part in an Open Source project), the moment your code is being used in the real world and real people are bitching about it or praising it, you get a better appreciation of what the task involves.

So it’s not just universities and schools that need to improve their teaching if we want to produce better coders. Employers should take a more open-minded approach to training staff to code – those that are keen and capable – even if it’s not part of their core competence. Technology providers should make it easier to code on their computers and operating systems out-of-the-box. Geeks need to be more open-minded and accommodating to interested beginners, and to build more approachable tools like Codecademy. Culturally, we need to treat coding less like some dark art or the preserve of a select few.

On that last point, the Guardian is to be applauded for barrier-breaking, for making the topic a little less mysterious and for engaging with it in a way I’ve seen precious little of from any other media outlet. And the page on how to teach code is a great start – it should really be called how to learn code, because it’s a collection of really useful resources. For what it’s worth, I wrote a blog post nearly three years ago on things to get started on – though if I wrote it today I would probably drop the tip on regular expressions (what was I thinking?).

If I had one last thing to add, it’s that all of the Guardian’s campaign, and the support from Government, is framed around coding for work. Which is important – we are in the economic doldrums and the UK cannot afford to fall behind other nations. But, at the same time, the first code a beginner writes is going to be crap, and not very useful. Even when they get to a moderately competent level, it won’t be very useful beyond the unique task it was built for. Making really good code that is reusable and resilient is bloody hard work, and it would be off-putting to make the beginner judge themselves against that standard.

We need to talk a lot more about why we code as well as how we code. I don’t code for coding’s sake, or just because I can make a living out of it. I code because it’s fun solving problems, it’s fun making broken things work, it’s fun creating new things. Taking the fun out of it, making it merely a “transferable skill” for economic advantage, will suck the joy out of it just like management-speak sucks the joy out of writing. It doesn’t have to be like that. Emphasise the fun, emphasise the joy of making the infernal machine do something you didn’t think it was possible to do, encourage the “Isn’t that cool?” or “Doesn’t that make life easier?”. Get the fun bit right first, and the useful bit will follow right after.

5 thoughts on “Why it’s not just about teaching kids to code”

  1. Strikes me there should be some rhetorical overlap with arguments for Latin in schools – in that any particular application taught is likely to be obsolete in relatively short order, but there’s lots of benefit to be gained from people understanding the general principles and from the world-view-opening de-mystification of the whole thing.

  2. “though if I wrote it today I would probably drop the tip on regular expressions (what was I thinking?)”

    Actually, I’d say they’re still very important. There’s plenty of times where I’ve got partway into coding some small, repetitive task, only to have it dawn on me that that part could be replaced by a fairly short, well-crafted regex.

    Sure, they might be a little complex, but they’re supported well in the vast majority of programming languages (at least those that are still in use), and they’re the best solution in more cases than you might initially think.

    (Plus, if you end up using one of the several POSIX-y systems as your primary OS, there are plenty of very useful utilities that make heavy use of regex (or very similar) syntax, such as grep and sed, to name just 2.)

  3. Hey guys – thanks for the comments

    @Psychedelic Squid – Yes, regular expressions are important, but perhaps a little daunting for the beginner, and it’s very easy to get bogged down in a confusing mess. I’d put them as a “second-tier” thing to learn and understand, along with how to do some basic POSIX-y tasks from the command line. In its place I’d probably introduce some gentler data extraction tools like ScraperWiki that weren’t around three years ago…

  4. Whilst you are correct that digital literacy is a broad term and that there are a range of different issues at stake, there is a common root to the different problems you’ve identified and that is a lack of programming experience taught in schools.

    Once young people realise the many different applications that a computer can be put to with a bit of coding (incl. all those mentioned in your post) and have the confidence to experiment themselves, they can develop their own skills, using the myriad resources available. (Many freely on the internet.) That knowledge is essential for all the myriad roles computer scientists might find themselves taking.

    Many kids lack either the knowledge or the confidence. They will forever be consigned either to being basic users of other people’s applications, or to trying to learn this stuff later, struggling to e.g. distinguish JavaScript from PHP… This is only getting worse as companies like Apple draw a sharper and sharper distinction between consumers and programmers. (E.g. making OS X look more and more like iOS.)

    This makes digital literacy a lot like numeracy – with kids who don’t leave school numerically literate forever disadvantaged. Just as with numeracy, schools have an essential role teaching the basics, and when it comes to computer sciences that does mean writing and running code.

  5. @Chris, @Psychedelic Squid: Whenever I’m asked how to automate some data reformatting task, the answer is usually a very simple s/foo (.*) bar/$1/ or similar. Regular expressions are as complex as you make them, and they’re very easy to learn incrementally. Even a very basic knowledge can be an extremely useful tool, and you can add extra features to your repertoire as and when you need them.
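    For anyone following along, that substitution is trivial to reproduce in, say, Python – a rough sketch of the same toy pattern (the function name and input are mine, purely illustrative):

```python
import re

# The toy substitution s/foo (.*) bar/$1/ from above, Python-style:
# keep whatever sits between "foo " and " bar", drop the surrounding noise.
def strip_wrapper(line):
    return re.sub(r"foo (.*) bar", r"\1", line)

print(strip_wrapper("foo 42,17,99 bar"))  # → "42,17,99"
```

    The `$1` of sed/Perl becomes `\1` in Python, but the pattern itself carries over unchanged – which is rather the point about regexes being portable across tools.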

    Teaching people to read any arbitrary regex is not going to go well though; of course there will always be regexes which are utterly cryptic (I still find this and I’ve been writing them for over a decade).
