AI - What's All the Fuss About?

Is AI the future of education, or is it all just marketing puff?

I'd rather have knobs and dials than a screen in my car, but the motoring trend that bothers me most is automatic headlights. Turning on lights isn't much of a chore, and it's important for a driver to be aware of their surroundings. Removing the need to think about whether headlights are required means that we're now much more likely to see unlit cars driving along motorways in the pouring rain.

I'm also not a big fan of parking sensors. They cause confusion by crying wolf - sounding when it's obvious that you're nowhere near anything (especially in the 1960s multi-storey car park at work). I don't need to know when I'm 30cm from something; I need to know when I'm about to touch it.

My guess, however, would be that car manufacturers did their research and were told that people wanted these things. But do people really know what they want? And is what they want really best for them?

A little knowledge is a dangerous thing, apparently, and it turns out that partial automation doesn't make vehicles safer. As Hannah Fry says in Hello World, cars will get more dangerous the less often they need intervention. If you only have to take over the controls every couple of years, then you'll become complacent.

My reservation about the use of "artificial intelligence" in education is just the same - that it takes away the need to think about the things that we should be thinking about.

Sticking with the car theme... about 30 years ago I was looking for a new car stereo (as you did in those days, when they were a standard size and were interchangeable) - probably with a new-fangled CD player. I went into a shop and the salesman pointed to one and said excitedly, "Look - when you turn this one on, a dolphin swims across the display!". That's a bit like how I feel when I see a lot of demonstrations of "artificial intelligence" in education.

Before we get any further, I should point out that I'm not a Luddite and I'm not in denial - about nine years ago I took Stanford's machine learning MOOC to see whether I could use some of the techniques to relieve my colleagues and me of the more onerous teaching chores ("No" was the short answer - we don't have enough data for training). I've also looked at the NCCE AI course.

I can see that AI can be "helpful" in some settings - using the Magic Eraser on my phone, for example, is much easier than using clone tools in image-editing software if you want to remove an unwanted element from a photograph.

I can also see why you might want to tell students, in Computing lessons, about machine learning as an alternative to "the algorithm" for solving a particular task, and about its implications for things like the Data Protection Act (i.e. data controllers having to explain how automated decisions are made). But is just using an LLM, or generating an image, Computer Science? I don't think so.

According to a recent article, most secondary teachers are uninterested in AI. How can we tell whether they're uninterested or whether they're making an active choice not to use "AI"?

We see lots of stories about "AI" in the news - that it decreases productivity, that "inbreeding" could cause the "collapse" of LLMs, that people devalue its competence (although not its advice!), that we should be panicking about it, and that "ChatGPT is bullshit".

My concern about "AI" in education is not that students use it to cheat and then regret it, or even that it appears to improve performance in students who use it - until you take it away, whereupon they perform worse than students who never had access. Nor is it the learned helplessness that results from using things like Google Maps.

I'm not proposing that we give up technology, as they're doing in some Finnish schools (even though the OECD says that computers do not improve results). While some people believe that we stopped evolving when we invented tools, our IQs increased throughout the 20th century as a result of using technology to communicate ideas.

There is evidence, though, that assistive technologies can be detrimental to progress. I noted in a previous blog that in one experiment, subjects were required to solve a logic puzzle - some using bare-bones software that simply let them make moves, and others using "helpful" software that suggested possible next moves:

"In the early stages of solving the puzzle, the group using the helpful software made correct moves more quickly than the other group, as would be expected. But as the test proceeded, the proficiency of the members of the group using the bare-bones software increased more rapidly. In the end, those using the unhelpful program were able to solve the puzzle more quickly and with fewer wrong moves."

But it's not even that. No, my concern about "AI" is the desire to use it to replace the core skills of being a teacher - the things that we really should be doing ourselves, e.g. planning lessons, creating resources and marking students' work.

Why is it important that we do those things ourselves?

I've recently finished reading Exam Nation: Why Our Obsession with Grades Fails Everyone - and a Better Way to Think About School by Sammy Wright. He has the following to say about "AI":

"Currently the most significant aspect of AI for schools and jobs is its ability to create convincing written text - if reports and verbal analysis can be created digitally, where do the traditional skills of literacy come in? The answer to this one is actually a lot simpler. If history has taught us anything, it is that the new rarely completely supersedes the old. Just because we have cars doesn't mean we don't need to walk.

...more importantly... just because you don't need to do something doesn't mean there is no worth in doing it. In many ways the reason we still need to write essays is the central thesis of this book: process, not outcome, is what matters. You don't write in order to have an essay, nice and neat and filed away - you write in order to have written an essay, and to have reordered and structured your thoughts along the way."

This latter part reminded me of exactly the sort of conversations we've been having recently about programming - I've heard people arguing that we no longer need to teach programming because AI can do it for us. I disagree, for the reason given above.

In the current education climate, where schools appear to value results more than education, it's easy to see why teachers value outcome over process. It's become increasingly fashionable in the past couple of decades for schools to buy a scheme of work rather than to write one, for example. It's also easy to see that teachers who shortcut the process and buy the outcome are missing things: because they haven't thought about the course as a whole, they don't have the overview needed to see its linking themes (e.g. combinations and truthiness).

For each of these - writing an essay, writing a program, planning a lesson, creating resources, marking students' work - which is more important: the process or the outcome?

Finally, there's the issue of deskilling to consider. If you're not doing any planning, resource creation or marking, are you meeting any of the Teachers' Standards, and can you justify your UPS (Upper Pay Scale) salary?

It's difficult to decide what we should tell the students - or for them to decide what they want to do when they leave school - because, when you're a teenager, the job you'll end up doing might not yet exist.

We change our minds about things all the time. I was an enthusiastic early adopter of DAB, for example, when it was sold on sound-quality grounds, but then went off it as it got more compressed (in both senses of the word). Conversely, I didn't like the artefacts in digital television when it first appeared, but I've now got used to them. As for streamed music - I still can't make up my mind about that, and won't be getting rid of my records any time soon.

It's early days for "AI", though, and someone or something might change my mind before I retire. Maybe I'm just waiting for that killer application? Or maybe - as there are many online articles saying that AI has reached a dead end - that killer application will never come.

PS. It's not just me. As I was finishing this page I found another article, called AI in education is a public problem, which outlines twenty-one arguments against the use of AI in education.

This blog was originally written in December 2024.