Scapegoating the Computing Curriculum

Why are reports in the media blaming the new Computing curriculum for a lack of "digital skills", diversity and gender balance?

Ever since Computing replaced ICT in the National Curriculum there have been discussions about whether students will have the skills required by employers, or how they might acquire those skills. A recent study has been reported as suggesting that there is a “digital skills gap”, with fewer students getting “vital” skills.

That article appears to be a little confused: it opens by claiming that students lack the "digital" skills that employers want, but most of the text is actually about the lack of diversity among students taking Computer Science as a GCSE.

For me, it also raises a number of further questions:

  1. What are these "digital" skills that employers think are lacking in our students? There's no mention of anything specific.
  2. ICT was compulsory in many schools, which may well explain why the uptake of Computer Science looks poor in comparison; do we know what percentage of students took GCSE ICT in schools where it was optional, for example? And do we know what proportion of students were girls in schools where ICT was optional? Some schools also entered students for two ICT qualifications at the same time; do the figures take that into account?
  3. More fundamentally, are schools really there to provide vocational training? Many people struggle to find a cleaner; should cleaning be added to the curriculum?

Computing and Computer Science seem to be getting the blame, but are they really the cause of the decline in “digital” skills, or are they being made a scapegoat by non-specialist teachers who don’t want to teach those subjects? Haven’t employers always complained about lack of skills?

Almost half of young people go on to higher education, so the first point we could make in Computing’s defence is that, if half of the newly-employed young people who lack the skills are graduates, they will have been 16 at least five years ago – i.e. when ICT was still a National Curriculum subject.

IT is always in a state of change, and I think that there’s a danger of conflating a number of issues.

When I first started teaching, back in 1997, not all students had a computer at home, and using a computer at school was a novelty. Students had lots to learn, and they were enthusiastic about ICT lessons. I should also point out that the ICT National Curriculum always contained a programming element (a strand called "Developing ideas and making things happen"), and my first ever lesson was teaching functions in BASIC to a year 9 class.

In the early days of home computers, Bill Gates said that it was his dream for the PC to become just another household appliance, like a television or fridge. Fast-forward a few years and that dream became reality. By the early noughties it was unusual for a household not to have a computer. Paradoxically, though, I started to notice a decline in students’ IT skills. My assumption was that if you’re already familiar with something at home then you don’t want to be told how to use it when you get to school; you don’t want to be taught how to use your computer any more than you want to be taught how to use your fridge. So I would estimate that “digital skills” had already been in decline for a decade before Computing was added to the National Curriculum.

It was at about this time that the KS3 Strategy was introduced in response to concerns over the lack of skilled teachers to deliver the ICT curriculum. It wasn't compulsory, but it gave the impression that where we were previously teaching relational databases, we could now get students to review web pages and talk about the difference between a fact and an opinion instead. It's at this point that a lot of the teachers who are resistant to teaching Computing appear to have joined us. It's also the point at which portfolio-based qualifications, such as DiDA and OCR Nationals, were introduced, removing the need for students to master any ICT skills.

I often get to see students' own computers, and in the last ten years I've started to notice that those computers contain fewer and fewer files that were actually created by the students themselves. It wasn't uncommon for a student's My Documents folder to be completely empty because the laptop had only ever been used for social media, YouTube and games.

It made sense, therefore, for these students to switch to other devices, and we’ve seen a proliferation in the number of alternative platforms – phones, tablets, Chromebooks, etc. Once again it’s not unusual for households not to have a PC, and even for students to only use iPads in primary school. In the Autumn term a year 7 student told me that he’d never used a computer before.

When I asked in a forum what "digital skills" an employer might think were lacking in our students, one suggestion was keyboard/mouse skills and filing. iPads don't have keyboards or mice, and have no accessible filing system. Typing is a skill, and speed only comes with practice. If a school has one KS3 lesson per week and we remove exposition time and any activities that don't require much typing (spreadsheets, editing images, etc.), then students might type for an average of maybe 15 minutes per week – not enough to master a skill without further practice at home.

But what skills are employers really looking for? I'd imagine that after a higher education course most candidates would be able to type at a reasonable speed. They might want "transferable skills", such as copying and pasting, but I'd be surprised if most students couldn't do that by the time they left school.

Most jobs require a limited range of skills, and possibly the use of bespoke systems. This may only require a small amount of training; if you employ someone who doesn't have the spreadsheet skills you require, but is otherwise suitable for the job, you can probably show them what to do quite quickly.

The same cannot be said of algorithmic thinking skills. Critics of Computing say that programming is just a niche subject for people who want to work in the software industry, but they’re missing the point. A program is just the measurable outcome of a thought process, and it’s the thinking that we want to encourage.

Programming is about problem solving and algorithmic thinking, and more people need algorithms than need spreadsheets; more people sort things or plan routes than create "multimedia products" in PowerPoint. Computer Science is relevant to people who don't even use computers, such as the Travelling Salesman and the Chinese Postman. In their book, Algorithms to Live By, Brian Christian and Tom Griffiths even go one step further and say that computer science principles are not only useful for sorting our CDs, but that thinking like a computer scientist can help with the big decisions in our lives.
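The kind of everyday algorithmic thinking described above can be sketched in a few lines of code. Here is a hypothetical, minimal Python illustration of the route-planning example – a greedy nearest-neighbour heuristic for deciding which place to visit next. The place names and coordinates are invented for illustration, and real Travelling-Salesman solvers are far more sophisticated; the point is that the program is just the written-down version of a thought process anyone running errands uses instinctively.

```python
import math

# Invented example data: places to visit, as (x, y) map coordinates.
places = {
    "School": (0, 0),
    "Library": (2, 1),
    "Shop": (5, 0),
    "Park": (1, 5),
}

def plan_route(start, places):
    """Greedy nearest-neighbour tour: from wherever you are,
    always go to the closest place you haven't visited yet.
    Not guaranteed to be optimal, but simple and intuitive."""
    route = [start]
    remaining = set(places) - {start}
    while remaining:
        here = places[route[-1]]
        nearest = min(remaining, key=lambda p: math.dist(here, places[p]))
        route.append(nearest)
        remaining.remove(nearest)
    return route

print(plan_route("School", places))  # ['School', 'Library', 'Shop', 'Park']
```

The same greedy "pick the best next step" shape turns up whenever we sort, tidy or schedule things by hand, which is why the thinking transfers well beyond software jobs.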

I think that the Computing curriculum is a massive improvement over the ICT one that preceded it, but I know that not everyone agrees. If you get frustrated that students can't type or name their files properly, though, make sure that you're not being swayed by your own confirmation bias and a reluctance to learn what object-oriented programming is, and ask whether there are factors at play other than the curriculum.

This blog was originally written for the TES Subject Genius series in June 2018.