AI and SEND

Feb 22, 2018

Miles Berry

There are a number of ways in which even limited AI can help pupils with special educational needs and disabilities (SEND) access school, and few would be surprised if future developments resulted in further affordances to support inclusion and accessibility.

The ‘expert system’ approach to AI, and more recent work in machine learning, have been applied with a degree of success to the diagnosis of some special educational needs (SEN), including specific language impairment, attention deficit, and dyslexia and related difficulties. At least one machine learning based tutoring system appears able to integrate dyslexia identification into routine activity presentation and assessment. Such approaches might allow schools to identify children for professional diagnosis by educational psychologists more reliably than relying on teachers’ or parents’ judgement alone.

Text to speech applications, although typically rule based rather than powered by AI, can make it possible for visually impaired pupils to access a far broader range of texts than would formerly have been possible, and can also be of benefit to other children for whom reading is a challenge, such as those with dyslexia. More recent applications of AI allow images, including live camera feeds, to be described in text or speech.

Speech to text has advanced rapidly in recent years, and many are now familiar with tools such as Siri, Alexa and Google Assistant. Automatic transcription of spoken language is increasingly accurate: hearing impaired pupils can access YouTube videos through automated captions, and a live transcript of a teacher’s introduction to a topic can be made available to hearing impaired pupils and to those who would benefit from being able to look back over what the teacher has said. Similarly, pupils themselves can use speech recognition to ‘type’ answers to questions, stories or essays; this can also be of benefit for pupils with dyslexia or visual impairment. The corpora on which these systems are trained include relatively little speech by young children, however, and thus results are currently rather less accurate for children than for adult speech.

Whilst English as an additional language is not in itself a SEN, pupils for whom English is a second language can use fully automatic machine translation to follow instructions and to access lesson content and texts. It also allows them to participate in the lesson and to ask questions of their teacher. Image and speech recognition can be used to provide translation between sign language and written or spoken text, although at present most of this work has been conducted in American Sign Language rather than British Sign Language.

Tools as simple as spelling and grammar checkers can be used by all pupils to correct errors in their work, but may be particularly useful for pupils with dyslexia. It is not clear whether such tools help pupils to learn correct spelling and grammar, but it is possible that some application of machine learning to the data generated by these tools would help. Pupils, including those with dyslexia, may also benefit from AI based text simplification tools in order to access complex texts.

AI based automatic, personalised tutoring systems might be particularly useful for pupils with SEN, for whom the usual path through a course of study may not be ideal: with enough prior data, machine learning algorithms might well be adept at tailoring a sequence of learning activities more appropriately for an individual learner than a teacher could be. Such tools may be particularly helpful for pupils with attention deficit or autism spectrum disorders who might find the demands of a typical classroom environment unconducive to study. By analogy with GPS-based navigation software: not all journeys start from the same place, and not all traffic can follow the same route, even if the destination is the same.

AI has been used to support pupils with autism spectrum disorders (ASD): human-like robots can be programmed to behave in a predictable way, making interactions less threatening for pupils with ASD and helping them develop a mental model of how the robot will react, perhaps making it a little easier to construct a theory of mind for human interaction. There are reports of children with ASD developing conversation skills through interaction with Siri, and subsequently applying these to interactions with people.

Submitted at their lordships’ request as written evidence for the House of Lords AI Select Committee