Kevin Oldershaw, Academic Director at Queen Ethelburga’s Collegiate

In anticipation of the release of the spring 2024 issue, which takes a look at technology’s role in teaching and how best to prepare pupils for an increasingly computer-based world, Kevin Oldershaw – Academic Director at Queen Ethelburga’s Collegiate – addresses the ‘electronic’ elephant in the classroom…

Artificial Intelligence (AI) is here to stay. Schools cannot afford to ignore it and, indeed, are doing students a disservice if they do not properly discuss and agree their fundamental approach to its use. As such, it is important to write an AI policy – there are many samples and templates out there to borrow from – though of course a school first needs to decide just how far it wishes to embrace the potential of AI.

At QE, our approach has very much been to see AI as a tool to enhance and support learners and teachers. Why wouldn’t we? It is already so embedded in daily life in different forms, whether it be driverless cars, facial recognition or suggested viewing on streaming platforms, that not making use of it seems counterintuitive. Not only this, but as AI is increasingly being utilised across a range of careers – from the early diagnosis of medical conditions to analysing market trends and informing hiring decisions – we feel obligated to increase students’ exposure to its potential. They will be entering a very different employment landscape to that of even 10 years ago, and we owe it to them to prepare them as best we can for their futures – isn’t that our fundamental purpose as educators?

So, given this approach, we wrote a policy (helped by an AI model, of course – they are good for that!) outlining to staff not only how AI can be used by teachers, leaders and support staff, but also how this must be done responsibly, considering issues around ethics, data privacy and bias. The same message was then repeated to students in a series of assemblies last September, delivered in an age-appropriate way and reflecting the different safeguarding guidelines for those under 13, those aged 13 to 18, and those over 18. With all students under 18, the emphasis has been on using AI models that do not require a log-in, and on ensuring they appreciate that they should not enter personally identifiable information into any website, including an AI model, if they do not know how it will be used. For the younger students, the approach has very much been to work with their teachers in lessons, using AI together as a class; for the older ones, the focus is on encouraging them to experiment with how it can help them, in a controlled and managed environment.

As we moved through the year – as we, other educators and governing bodies began to get to grips with how AI can be and is being used by staff and students, and as various guidance was published and updated – we produced our own documentation for students. It helps them to better understand when it is, and isn’t, acceptable to use AI in their education, how to appropriately reference the use of AI in coursework, and how to work closely with their teachers to learn the best strategies.

Further work was done with teachers on how to use AI most effectively: keeping the interaction conversational, crafting effective prompts and then building on them to train the AI model in exactly what you need from it. As teachers, we are used to clearly articulating to students what we want them to do, observing their responses and then amending our instructions and guidance, so we are ideally placed to train AI models in the same way! Our academic staff have been able to opt into a range of CPD training sessions, learning how to model AI best practice in the classroom, how to use AI to personalise their teaching and provide feedback to students, and how it can help them tackle intense workloads to increase their contact time with students.

Appreciating the need to keep up to date with this rapidly developing area, and to ensure that AI is seen as a tool for people to use rather than to fear, we created and recruited to a new role: Head of Cognitive Science and Digital Literacy. As an experienced Creative Media teacher and an enthusiastic user of technology, Jason Sharma-Pay is ideally placed not only to help us further develop our digital strategy, but also to better understand how human creativity, empathy, emotional intelligence and capacity for critical thought can ensure artificial intelligence enhances teaching, learning and student outcomes.