Have you ever watched your pupils writing on an iPad? For their generation, this seems almost natural: many develop a muscle memory for the ‘keys’, and others make use of built-in tools that make text entry that bit easier, including speech recognition and swipe-based keyboards. More interesting still is the way many pupils opt for the machine-generated suggestions offered as predictive text above the keyboard: it seems possible for many pupils to create quite lengthy stories or essays mostly by choosing one of the three words offered as suggestions.
The technology that powers predictive text is similar to, but far less powerful than, that which powers ChatGPT, Bard and other generative artificial intelligence built on large language models. Broadly, these work by deciding which word (or ‘token’) is best to pick next, based on the prompt received and all the words generated so far. There’s a bit more to it than that: much of a model’s character is determined by the semi-randomness (‘temperature’) of the text it generates. The probabilities of which token might come next are based on the large body of text from the open web on which the model has been trained. I don’t think there’s any strong sense in which ChatGPT understands these texts, although it might appear to do so. Nor does it have any real sense of which sources are more reliable than others, although it does have a sense of what’s ‘normal’ and what’s not. It’s also been trained to be polite and to avoid swearing, and its developers have built in a number of ‘safety’ features to prevent it from generating text that might be offensive or inappropriate.
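For readers curious about the mechanics, here is a minimal sketch of next-token choice with a temperature setting. The word probabilities and example sentence are entirely made up for illustration; a real model like ChatGPT computes these probabilities with a neural network over a vast vocabulary, not a hand-written table.

```python
import math
import random

def sample_next_token(probabilities, temperature=1.0):
    """Pick the next token from a probability table.

    Low temperature concentrates the choice on the likeliest word;
    high temperature flattens the distribution, making output more varied.
    """
    # Rescale each probability by 1/temperature in log space,
    # then renormalise so the weights sum to one.
    weights = {tok: math.exp(math.log(p) / temperature)
               for tok, p in probabilities.items()}
    total = sum(weights.values())
    tokens = list(weights)
    return random.choices(tokens, [weights[t] / total for t in tokens])[0]

# Illustrative probabilities for the word after "The cat sat on the..."
next_word = {"mat": 0.6, "sofa": 0.25, "roof": 0.1, "moon": 0.05}

# Near-zero temperature almost always picks "mat";
# higher temperatures produce more surprising continuations.
print(sample_next_token(next_word, temperature=0.1))
print(sample_next_token(next_word, temperature=1.5))
```

At very low temperature the function behaves almost like predictive text’s top suggestion; raising it is what gives chatbot output its variety.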
Many of the teachers and trainee teachers with whom I’ve worked have tried ChatGPT for themselves, and have found a whole range of relevant use cases: creating medium-term plans and lesson plans, identifying the content to cover in a presentation, creating multiple-choice quizzes and mark schemes, producing model answers for exam questions, and even writing code for analysing pupil attainment data.
For pupils though, I think it’s already possible to see ChatGPT fulfilling some (but by no means all) of the roles of a teacher or teaching assistant. Vygotsky wrote about learning happening in the zone of proximal development, where the learner is just beyond the point where they can do something on their own, but not so far beyond that they can’t be helped by a more experienced person. Traditionally we’d see teachers as the ‘more knowledgeable other’, but we’ve come to recognise that fellow pupils can take on this role, and I think we’re now seeing the emergence of ChatGPT as a new kind of ‘more knowledgeable other’: one that can be accessed at any time, and one that can help with a wide range of tasks. Under-13s can’t create their own accounts for ChatGPT, and those between 13 and 18 should only do so with parental consent. At the moment, it’s a text-based interface, so pupils’ literacy skills need to already be at a good level for it to be accessible to them, although it can be used in combination with the text-to-speech and speech-to-text accessibility features built in to modern operating systems.
Whilst it’s easy enough for pupils to get started using ChatGPT for themselves, making effective use of it is a skill that can be both taught and learnt. Using ChatGPT well is about coming up with good ‘prompts’ for the AI - the information you provide it with or, more typically, the question you ask it. I think there’s a parallel here with our role as educators - our learners learn best when we ask good questions or provide a good stimulus as a starting point for their response, and indeed some of the best interactions with ChatGPT can have something in common with great dialogues between teachers and their pupils.
ChatGPT is very good at coming up with answers to most homework questions - it’ll provide answers to comprehension questions if given the source text, or write to a given length on most topics. Crafting the prompt so that it includes information given in the lesson or from the textbook can help produce an answer closer to what might be expected. It’s also pretty good at creative writing, including poetry. Pupils can ask it to rewrite text in the style of, say, an eleven-year-old, or more usefully to explain or elaborate particular points. It’s less good at maths, at least at the time of writing, but I’m sure this will get better soon.
I worry about pupils using it to do their work for them, as this seems to defeat the educational objective of the task. Teaching pupils how they can use ChatGPT to help them learn seems more useful: it’s good at suggesting ideas for an essay or a story, and it can help make improvements to something which pupils have already written. It’s also good at explaining something, and these explanations can be phrased in language appropriate for younger readers, or expanded on. It can also create revision or practice questions on a topic, and will engage in a Socratic dialogue if asked to.
Pupils’ use of ChatGPT raises ethical issues - the primary and secondary pupils I’ve spoken to about it are generally clear that it’s fine to ask it to explain things, to ask it for ideas, or to get help improving something they’ve written, but that it crosses an ethical line to pass off its work as their own. This is much the same way they’d think about advice from their peers, parents or tutors.
Arthur C. Clarke suggested that ‘any teacher who can be replaced by a machine, should be’. I think (and hope) that we’re a long way from that, but for a motivated, literate and connected learner, ChatGPT can be a powerful tool for learning. As teachers, I think we need to be aware of how pupils can and will use these systems, and offer our own ‘prompts’ to pupils to help them use the technology positively.
Originally published in Sapientia, the ICT for Education newsletter.