UK announces AI funding for teachers: how this technology could change the profession
During the recent international AI Safety Summit held in the UK, the government announced a further £2 million to be invested in Oak National Academy – a publicly funded classroom resource hub – to develop artificial intelligence tools to help reduce teachers’ workloads.
Generative AI, such as OpenAI’s ChatGPT, responds to prompts from users to produce content. It has become a hot topic in education.
While there isn’t much up-to-date research on how teachers are using AI, we know from our work with schools that teachers are experimenting with AI to create lesson plans, classroom resources and schemes of work. For example, a teacher might ask ChatGPT, “make me a lesson plan on river flooding in Tewkesbury for year seven”. Within seconds, a plan will be available containing learning objectives, materials, activities, homework, assessments and more.
Technology giants Google and Microsoft, as well as established education technology platforms such as Khan Academy, are promoting their AI offerings to schools.
Start-ups and smaller operations are also getting in on the action, many of them promising time-saving tools that can take on much of the planning, thinking and feedback that happens before and after classes.
Why is AI attractive?
Two factors help explain teachers’ take-up of AI.
One is workload. Burn-out and stress in the teaching profession are a key reason 41% of all teachers are planning to quit within five years.
Having lesson plans, handouts and student reports available in seconds is alluring. For government ministers who have long promised teachers reduced workloads and better working conditions, AI seems to offer a tangible and affordable answer.
Teacher supply is also important. Headteachers are facing significant challenges finding enough teachers, as a result of rising pupil numbers, teachers quitting and low numbers joining the profession. The idea of AI supporting teachers who lead classes in short-staffed subjects may be particularly appealing.
AI-designed lessons could source expert content in fields such as mathematics or physics, two subjects with low recruitment numbers. These lessons could arguably be more pertinent and accurate than lessons that would otherwise be designed by non-specialists.
There are clear potential benefits in terms of time saving and access to subject-related content. But how these tools might affect the teaching profession more broadly needs to be considered.
Devaluing teachers
Teaching is widely recognised as an intellectual endeavour. But generative AI tools have no educational or disciplinary expertise of their own. The lesson plans they produce are simply sequences of plausible content, assembled from the material in the data sets they have been trained on in conjunction with prompts from users. In this respect, they are recycled expertise.
Their use may weaken the place of scholarship in teaching. Teachers may find themselves acting as executive technicians – circulating worksheets and managing behaviour – rather than considering deeper questions about what should be taught and how, and the moral concerns of education.
What’s more, a move toward teachers as technicians is unlikely to attract high quality graduates – or make our education system the envy of the world.
In using AI to reduce teacher workload, there is also a risk that the needs of specific groups of students, and their contexts, will be ignored.
Take curriculum and lesson design. Teachers consider a wide range of factors when developing a curriculum and the individual lessons within it. Ideally, they incorporate a strong sense of the subject knowledge and skills to be learned, while taking into account their particular class of students and the context of the lessons. This may well be lost in AI-produced lessons.
There is also the question of how teaching expertise is developed and maintained. Generative AI models have a tendency to make up new facts and sources, which is sometimes referred to as “hallucinating”. The content they produce can also be biased and discriminatory.
This means that any content created by AI must be critically reviewed. But if the “thinking work” of lesson and curriculum planning is outsourced to AI tools, and “enacted” by enthusiastic but non-specialist teachers, no one is accountable for the quality, safety and relevance of these materials.
If AI becomes routine, teachers may not develop the skills needed to critically evaluate and adapt AI-generated lessons and activities for the students in front of them.
Someone (or something) else doing the educational thinking, with a “presenting person” in front of the students, may free teachers from the burden of planning and assessment. But we must think carefully before mathematics, physics or any other subject we have deemed important and relevant to our existence and civilisation is reduced to a diluted and potentially misrepresented version of itself.
If there is money from the government to invest in education, this should go directly to the most important resource in any classroom – the teacher.
The authors do not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointments.