AI in initial teacher training

Mar 06, 2024

Miles Berry

As new technologies are developed, there’s often a feeling that the nature of education changes: Plato famously had Socrates bemoan the invention of writing as it would lead to forgetting and the mere appearance of wisdom. The printing press made learning accessible to those outside monasteries and universities, and the web opened up publishing to the masses, for good or ill. And yet, the fundamental nature of education has remained the same: it’s about the transmission of knowledge, the development of skills, and the formation of character. I hope I can thus be forgiven for being sceptical that Large Language Models such as GPT-4 will revolutionise education, although, like writing, the printing press, and the web, they will certainly change it.

The main part of my work at the University of Roehampton is training the next generation of teachers, and so I’ve a duty to prepare them as best I can for a working life in which AI, including generative AI, looks set to play a role. Teacher training is heavily regulated, and the requirements for qualified teacher status, the Teachers’ Standards, have not been updated since 2011: unlike the previous version, these standards make no reference to technology. Nor does the current version of the ITT core content framework, the DfE’s curriculum for teacher training, although the next version will, making it clear that even new teachers need to know about the technologies that improve pupils’ outcomes, particularly in supporting those with SEND.

Teacher training students need to develop their capability to use generative AI wisely and effectively in their roles as students and as teachers, and in how they deal with pupils’ own use of these technologies.

For our PGCE or undergraduate students, my colleagues and I have been concerned most with the temptation for academic misconduct: submitting for assessment work done by AI rather than by the students themselves. We’ve policies in place, of course, and have emphasised that unattributed and unauthorised use of generative AI would be treated like any other sort of plagiarism. I’ve also been keen to emphasise that our essays and other assignments are a means to learning rather than merely an assessment of it, and that learning can only take place when students are consciously engaged in the writing. Through discussion with students and colleagues across the University, we’ve established a helpful rule of thumb for using AI in academic work: if you could reasonably expect this support from peers or tutors, then it’s probably OK to use the AI too. Thus, it’s OK to get ChatGPT to explain or summarise a text, to offer some ideas to get started, and to suggest how a piece of work could be improved, but it’s definitely not OK to get it to do the work for you: copying and pasting its output is almost certainly wrong.

As beginning teachers, I want my students to get all the help they can in keeping the administrative workload of teaching as light as possible, so they can spend more time teaching and inspiring their pupils. We explore how generative AI can help with planning lessons, creating resources and assessment items, and adapting content to better meet the needs of their pupils. They know they need to be critical of the output from generative AI, due to issues such as ‘hallucination’ and bias, but I’d hope they’d be critical of planning, teaching and assessment materials from other sources too!

For their pupils, I’d want my students to be able to pass on some understanding of how these tools actually work: to realise that whilst LLMs are very well read and very articulate, they have no understanding, reasoning or sentience. I’d like my students to be able to empower their pupils to use these technologies effectively: having the knowledge they need to be able to prompt well, and to evaluate the AI’s response. I’d also hope they’d talk to their pupils about the value of working hard to understand something themselves, and of developing the qualities that make them uniquely human, such as kindness, courage and creativity.

Originally published in the ISC Bulletin Issue 46