How I'd Update the GCSE in Computer Science

This is how I'd improve the GCSE in Computer Science - why not produce your own list?

It's a common question in Computing - how would you change the curriculum? There's recently been some suggestion that there should be a new GCSE, called Computing, which is more similar to the KS3 National Curriculum. A common criticism is that the current GCSE is not "relevant" and doesn't contain enough ICT, but I would argue that it might be better to go the other way - that making it further removed from the computer would make it more relevant.

Overview

My view is that school is for education, rather than vocational training or exam preparation, and no-one has ever suggested that students will need knowledge of polders in the Zuider Zee or the Cold War in their future employment. Equally, not everyone uses a "standard" set of applications (e.g. word processing, spreadsheet and presentation software) in their work. Not everyone requires knowledge of the fetch-execute cycle in their work either, but most people might need to sort, search and organise things efficiently in a variety of contexts.

I'd aim to make it more obvious that a GCSE in Computer Science is not just preparation for working in the software industry, but is about being efficient and organised in your approach to "life", with examples of how computer science approaches are used elsewhere (as we used to do with "ICT in Medicine", "ICT in Education", etc., in the old ICT GCSE).

Questions in GCSE Maths, for example, attempt to link abstract concepts to "real life" scenarios, using recipes, special offers and school trips. Could we not do the same with Computer Science? John needs to sort the books in the library, but how is he going to do it without getting them all off the shelves?

Not everyone will use everything from the GCSE - just as not everyone uses all of maths - but you don't know what you'll need or when. A common question in Maths lessons is apparently "When will I need algebra?", even though it's something that people use all of the time, without realising - e.g. to compare deals in the supermarket.

We need a more coherent, more "relevant" GCSE that moves students on from KS3. In many ways they are currently standing still - programming, representation, binary and Boolean logic are all covered at KS3, for example - and in some ways they're moving backwards because students are required to use two programming languages at KS3 and only one for GCSE. I've always thought that it was better to master one language, but some exposure to other languages - especially ones that include things like arrays - wouldn't hurt.

I'd say that there are three big areas in the current GCSE:

  • representation
  • mathematical logic
  • algorithms and programming

I would keep and expand these and focus on the links between them, so although my proposal might look like more content, it could actually mean fewer explicit concepts.

What I'd Remove

Each generation has its share of declarative knowledge that is neither useful nor interesting. For my cohort in the 80s it was the relative merits of daisy-wheel and chain printers.

I think that, for current students, the equivalent is still mostly hardware detail.

I'd make the point that GCSEs are for the flavour of a subject and A level is for the detail. More importantly, just because something isn't in the specification doesn't mean that you can't teach it anyway.

What I'd Add or Expand

I'd rearrange the remaining content into themes as described above, i.e. representation, mathematical logic and algorithms/programming.

In order to make it clearer that computer science isn't only about computers, but about thinking through processes so that we can get computers to carry them out, I would expand the current sections and emphasise the connections between them to make their relevance clearer.

For each topic, here's what I'd include...

Representation and Storage

  • how programs are stored - to make it easier to understand what compilers and interpreters are doing
  • what text files can be used for, e.g. CSV or HTML
  • other ways of representing text, images and sound - e.g. vector/midi with links to non-computing applications such as music boxes, knitting patterns and Morse code
  • maybe ASCII art (with compression) and the nature of Morse code could be used to link to compression and efficiency
  • Huffman coding and/or RLE (with explicit links to colour depth) to make it clearer how compression actually works - there's a rough RLE sketch after this list
  • explicit links between binary, Boolean values, circuits and storage media to show that they're really all just ways of representing binary values
  • possibly HTML as an example of a text file, but also to link to compression (e.g. image file format), networking and linking vs. embedding
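
As a rough illustration of how RLE could be demonstrated, here's a minimal sketch in Python - the function names and the example string are my own, not anything taken from a specification:

```python
def rle_encode(text):
    """Run-length encode a string: 'AAAB' becomes [('A', 3), ('B', 1)]."""
    encoded = []
    for char in text:
        if encoded and encoded[-1][0] == char:
            encoded[-1] = (char, encoded[-1][1] + 1)
        else:
            encoded.append((char, 1))
    return encoded

def rle_decode(pairs):
    """Rebuild the original string from the (character, count) pairs."""
    return "".join(char * count for char, count in pairs)

# A row of "ASCII art" compresses well because of its long runs of repeated characters.
row = "....XXXXXXXX...."
print(rle_encode(row))                       # [('.', 4), ('X', 8), ('.', 4)]
print(rle_decode(rle_encode(row)) == row)    # True - nothing is lost
```

The same example leads naturally into why RLE suits images with large blocks of a single colour, and links back to colour depth.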

Maths for Computing

  • EOR/XOR - useful because it's reversible and because of its links to encryption (two-way light switches make a nice everyday example); there's a short XOR sketch after this list
  • adder circuits could be used to link together binary addition, logic circuits and the hardware of the ALU - not really an addition to the specification, as logic circuits are already included; it's just a specified example
  • bitwise logic - just a combination of binary and Boolean logic (which are already covered), but it's a useful technique and can be used to link programming to the conversion of denary to binary; a half-adder and bitwise-conversion sketch follows this list
  • an explicit section on the use of permutations and combinations in computing, e.g. password strength, the number of values stored in n bits, colour depth, bus width, file size, the number of rows in a truth table, etc. - a few worked examples appear after this list
  • only mention the fetch-execute cycle to discuss what a general purpose computer is - with links to devices with a processor that aren't programmable (e.g. calculators) and devices that take input for control only, e.g. Jacquard looms.
  • replace protocols and layers with transmission, errors and parity - e.g. parallel vs. serial transmission and parity checks (perhaps introduced with the card-flip magic trick); a short parity sketch follows this list
  • discussion of what "random" and "serial" access mean, e.g. in RAM
  • expanded section on number bases - I think that a greater number of examples might help students to better understand the idea of place value and the links to multiplying or dividing by powers of ten in maths. I would include decimals in binary and hexadecimal - and possibly octal. We would discuss the separation of the concept of a number from its representation, e.g. with tally, Roman numerals, foreign languages, etc.
  • simple graph theory - e.g. GCSE Maths style questions on shortest/cheapest/quickest route - which could also be used as an introduction to networking (but only in terms of what a network is and why it's used)
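
To show what I mean about the reversibility of XOR, here's a minimal sketch of a toy XOR "scrambler" - the key and message are made up for illustration, and this obviously isn't serious encryption:

```python
def xor_bytes(data, key):
    """XOR every byte with the same single-byte key; doing it twice restores the original."""
    return bytes(b ^ key for b in data)

message = b"HELLO"
key = 0b10101010

scrambled = xor_bytes(message, key)
restored = xor_bytes(scrambled, key)   # XOR is its own inverse
print(scrambled)
print(restored == message)             # True
```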
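
Similarly, a half adder is just the XOR and AND that students already know, and bitwise AND with a shift converts denary to binary - again, a sketch rather than anything from a specification:

```python
def half_adder(a, b):
    """Add two single bits: the sum is XOR and the carry is AND."""
    return a ^ b, a & b

print(half_adder(1, 1))    # (0, 1), i.e. 1 + 1 = 10 in binary

def to_binary(n, bits=8):
    """Pick out each bit of n with a bitwise AND and a shift, most significant bit first."""
    return "".join("1" if n & (1 << i) else "0" for i in range(bits - 1, -1, -1))

print(to_binary(150))      # 10010110
```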
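
The permutations and combinations work is mostly repeated multiplication; a few worked examples of the kind I have in mind (the figures are purely illustrative):

```python
# n bits can store 2**n different values
print(2 ** 8)      # 256 values in one byte

# Colour depth: 24 bits per pixel gives 2**24 possible colours
print(2 ** 24)     # 16777216

# Password strength: 8 characters, each chosen from 26 letters and 10 digits
print(36 ** 8)     # 2821109907456 possible passwords

# A truth table for 3 inputs has 2**3 rows
print(2 ** 3)      # 8
```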
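
And the card-flip trick translates directly into an even-parity check - a minimal sketch, with a made-up block of bits:

```python
def add_parity_bit(bits):
    """Append an even-parity bit so that the total number of 1s is even."""
    return bits + [sum(bits) % 2]

def parity_ok(received):
    """The data is (probably) intact if the number of 1s is still even."""
    return sum(received) % 2 == 0

data = [1, 0, 1, 1, 0, 1, 0]
sent = add_parity_bit(data)
print(sent, parity_ok(sent))    # [1, 0, 1, 1, 0, 1, 0, 0] True

sent[3] = 1 - sent[3]           # flip one bit "in transit"
print(sent, parity_ok(sent))    # the single-bit error is detected
```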

Algorithms and Programming

  • explicit mention of the word efficiency in the specification, with discussions of variable (re-)use, minimising repetition, simplifying calculations, etc.
  • discussion of different types of user interface - e.g. how certain types of form control can reduce the need for validation
  • a defined set of practice tasks for consistency of experience (like the OCR Coding Challenges, but simpler)
  • selection sort (as it's easy to code - see the sketch after this list) for links to programming, and link merge sort to parallel processing
  • promotion of practical examples, e.g. sorting books - you wouldn't use a merge sort because you'd have to take them all off the shelves, whereas an insertion sort would work well - or using a binary search to find the time that something went missing in CCTV footage, rather than watching the whole thing.
  • the idea that sorting and searching go together, e.g. if you've only got ten books then it's probably quicker to do a linear search than a sort followed by a binary search, but if you work in a library it's worth keeping the books in order - a linear vs. binary search sketch follows this list
  • explicit mention of databases - e.g. the current OCR specification mentions fields, records and queries, and requires SQL, but doesn't mention the word "database"; a small SQL sketch follows this list
  • discussion of other data structures - e.g. binary trees/branching databases (which could be used for the OCR Coding Challenges "Classification" task)
  • cookies as a storage mechanism
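
Selection sort really is short enough to code from scratch - a minimal sketch in Python:

```python
def selection_sort(items):
    """Repeatedly find the smallest remaining item and swap it into place."""
    for i in range(len(items) - 1):
        smallest = i
        for j in range(i + 1, len(items)):
            if items[j] < items[smallest]:
                smallest = j
        items[i], items[smallest] = items[smallest], items[i]
    return items

print(selection_sort([38, 27, 43, 3, 9, 82, 10]))   # [3, 9, 10, 27, 38, 43, 82]
```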
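
The sorting/searching trade-off can also be shown directly: linear search works on anything, while binary search needs the data to be in order first (the "shelf" below is just an invented example):

```python
def linear_search(items, target):
    """Check every item in turn - no need for the list to be sorted."""
    for index, item in enumerate(items):
        if item == target:
            return index
    return None

def binary_search(items, target):
    """Repeatedly halve a sorted list - like jumping about in CCTV footage."""
    low, high = 0, len(items) - 1
    while low <= high:
        mid = (low + high) // 2
        if items[mid] == target:
            return mid
        if items[mid] < target:
            low = mid + 1
        else:
            high = mid - 1
    return None

shelf = sorted(["Dickens", "Austen", "Orwell", "Pratchett", "Tolkien"])
print(linear_search(shelf, "Orwell"))   # checks the items one by one
print(binary_search(shelf, "Orwell"))   # same answer in at most log2(n) + 1 comparisons
```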
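
On databases, even a tiny example makes the fields/records/queries vocabulary concrete. This sketch uses Python's built-in sqlite3 module with an invented books table:

```python
import sqlite3

connection = sqlite3.connect(":memory:")   # a throwaway, in-memory database
connection.execute("CREATE TABLE books (title TEXT, author TEXT, year INTEGER)")
connection.executemany(
    "INSERT INTO books VALUES (?, ?, ?)",
    [("1984", "Orwell", 1949), ("Emma", "Austen", 1815), ("Mort", "Pratchett", 1987)],
)

# Each row is a record, each column is a field, and the SELECT statement is a query.
for record in connection.execute("SELECT title, year FROM books WHERE year > 1900"):
    print(record)
```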

This blog was originally written in October 2024.