Can we equip our young people with the skills required to harness emerging AI technologies? Is it possible to prepare them for a world with radically different jobs and careers, working alongside AIs? Where should schools begin adapting the curriculum, especially when the teachers themselves aren’t yet experienced in the very skills we would want to pass on?
To us at Eton, the heart of this endeavour is how emerging technologies can directly benefit the students. Although the questions ‘how can AIs help teachers plan lessons’ and ‘how can AIs do marking and assessment’ are interesting, of significantly higher priority is the question ‘how can students directly interact with AI technologies to benefit their learning and their wider development’.
One potential starting point is to map established digital skills to their AI literacy counterparts. Let’s begin by breaking digital skills down into four strands:
Firstly, ‘navigating systems and solving problems.’ The established skills of navigating operating systems and the web evolve into recognising AIs and their uses, including agentic AIs that have the capacity to act independently. Knowing what software or apps to use, and when, naturally translates to choosing the right AI tools for a task. Although there isn’t a clear parallel to the established skills of digital organisation and device maintenance, something equally critical to getting the most out of emerging technologies is prompt engineering, which is in effect an extension of natural language literacy.
Secondly, ‘digital creativity and collaboration.’ The natural extension of the digital communication skills that students will have previously gained is co-creating with AI: AI supported planning, for example, and draft-feedback-refine loops, both of which involve communicating intent and filtering suggestions. Similarly, the sharing of digital documents and collaborative workflows transitions into integrated and connected AIs that have access to your files and resources to improve productivity. Multimedia content creation, already an essential digital skill, is enhanced by multimodal AIs that work with images and voice, both as inputs and outputs.
Thirdly, the vital strand of ‘critical thinking and online safety.’ Determining factual accuracy becomes even more relevant with AI, demanding an added awareness of accuracy, bias, and alignment in generative AI outputs. Student understanding of online risks segues into the ethics and impacts of AI use. Additionally, existing endeavours to engender the maturity to moderate recreational use of digital platforms now run in parallel with an awareness of the AI algorithms that make those platforms so compelling, and often addictive.
Finally, ‘academic application and innovation.’ Clearly, research skills are augmented by AI, but it’s the opportunities for personal tuition that are potentially the most impactful of all AI literacy skills – more on this later! Existing computer programming skills can be built on to understand machine learning methods, i.e. how AIs actually work. Likewise, physical computing, e.g. tinkering with Micro:bits, progresses into the development of AI powered robots.
Equipped with this ‘wish list’ of skills we’d like to develop in our students, the next question is how?
Teacher training is worth considering. INSET sessions could be reserved, awareness raised, and discussions initiated. The challenge, of course, is who will deliver it? And afterwards, will teachers actually use the skills from the training when delivering the curriculum?
Another strategy is resources and courses, facilitating self-paced, differentiated training. If you’re not in a position to prepare your own, there’s a range of ‘off the shelf’ content (OpenAI Academy, Google training resources, Raspberry Pi’s Experience AI, and many more). These don’t need to be restricted to teachers; students can participate, too. However, experience shows uptake can be low if these are optional. Training resources can also go stale very quickly in such a rapidly developing field.
We could let Microsoft or Google take the strain and surrender to platformisation, centring our focus on Copilot or Gemini. (Platformisation refers to the way the tools, features, and algorithms of a digital platform shape how individuals and institutions operate, interact, and make decisions.) Realistically, this will happen even without us acting, as it’s in the interest of giant tech companies to educate us in how to use their AI offerings. It’s not a free lunch, though, as we risk platform lock-in and curated experiences designed for a U.S. audience.
Could the computer science curriculum encompass emerging technology education? Lessons are usually taught in computer rooms, and all students will have timetabled lessons, at least until the end of KS3. Fundamentally, though, computer science GCSE and A level specifications only briefly touch on AI, and almost certainly pre-date generative AI systems like ChatGPT. An experienced CS teacher might not be an AI expert, and that’s even before we consider the national shortage of CS teachers. Also, if emerging tech is limited to CS lessons, that doesn’t help other teachers.
There is a strong argument for AI becoming a new subject in its own right. It would be highly valuable to employers, which might give the incentive for a qualification body to give it their support. Sadly, though, computer science professionals with AI and machine learning backgrounds are in demand across all sectors, and a new subject would further exacerbate the shortage of computer science teachers. If the new subject were non-technical, i.e. avoided the ‘how they work’ and focused just on ‘how to use them’, it might be more feasible but of significantly less value.
Pulling further on the non-technical thread, how about the PSHE programme and assemblies? The economic and societal implications of AI are a natural fit here, and there’s a range of options for delivery style. However, with the statutory requirements that must be covered, time is very limited. Since there’s no guarantee of device availability, the effectiveness of sessions may be limited, too. And, of course, we face the same dilemma as with teacher training: who will deliver it?
If agreed on by senior leaders, the approach of delegation and integration, where responsibility for AI literacy is shared out across subject areas and built into schemes of work, is worth exploring. In theory, this could share the workload and maximise coverage. Curriculum departments could focus on what’s valuable for their subject. What are the cons? Well, quality control would be challenging. Inevitably, we circle back round to teacher training, since non-specialists would be asked to deliver guidance to their students. Time would need to be allocated, both for this training and for actually delivering the content, which could mean less time for covering the specification – when teachers are already facing time pressures to get through the course, this might not be well received.
Perhaps we are missing the best resource of all: the students themselves! Could the solution be role reversal, where the students become the teachers? Many teachers are fully prepared to admit their students are more technologically confident than they are. A significant number of classes (although not all – which could be the Achilles’ heel) could be home to a student who is able to share their beneficial uses of AI with the group. This would be a valuable confidence boost for the students and would promote academic honesty. On the other hand, teachers may find it hard to ‘let go’, or might have to re-teach ideas outside of their comfort zone – although a student may possess knowledge, that doesn’t mean they know how to teach it effectively.
Is it possible for the teacher to remain in the lead without needing sophisticated skills themselves? By modelling and scaffolding AI powered activities, we could potentially bypass gaps in teacher knowledge whilst still unlocking the benefits of emerging technologies for learners. Eton’s in-house developed CAITLibot (Conversational AI for Teaching and Learning, ‘ideas-bot’) is a website for teachers that generates bespoke ‘recipes’ for AI powered learning activities. Teachers select a subject and specify a topic, then choose which of a range of generated prompts they’d like to send to students to complete on their own devices, using ChatGPT or a similar AI. In short, CAITLibot does the prompt engineering for you, scaffolding interactive activities grounded in tried-and-tested pedagogic approaches. It also models good ways of using AI, showing how to turn AIs into personal tutors.
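To make the ‘recipe’ idea concrete, here is a minimal sketch, in Python, of how a tool of this kind might turn a teacher’s choice of subject and topic into ready-to-send student prompts. The pedagogic patterns, names, and structure below are illustrative assumptions for the sake of the sketch, not CAITLibot’s actual implementation.

```python
# Hypothetical sketch of a 'recipe' generator in the spirit of CAITLibot.
# Templates, names, and structure are illustrative assumptions only.

from dataclasses import dataclass


@dataclass
class Recipe:
    title: str
    student_prompt: str  # the text a student pastes into ChatGPT or a similar AI


# A few tried-and-tested pedagogic patterns, expressed as prompt templates.
PATTERNS = {
    "Socratic tutor": (
        "Act as a patient tutor for {subject}. I am studying {topic}. "
        "Ask me one question at a time, wait for my answer, and guide me "
        "towards the key ideas without giving the answer away."
    ),
    "Explain then quiz": (
        "Explain {topic} ({subject}) in simple terms with one worked example, "
        "then give me three practice questions of increasing difficulty and "
        "mark my answers one by one."
    ),
    "Spot the mistake": (
        "Write a short explanation of {topic} in {subject} that contains two "
        "deliberate errors. Ask me to find them, then reveal and correct them."
    ),
}


def generate_recipes(subject: str, topic: str) -> list[Recipe]:
    """Turn a subject and topic into ready-made student prompts."""
    return [
        Recipe(title=name, student_prompt=template.format(subject=subject, topic=topic))
        for name, template in PATTERNS.items()
    ]


if __name__ == "__main__":
    # A teacher picks a subject and topic; the tool does the prompt engineering.
    for recipe in generate_recipes("Physics", "Newton's third law"):
        print(f"--- {recipe.title} ---\n{recipe.student_prompt}\n")
```

The point of the design is that the pedagogy is baked into the wording of each generated prompt, so a student simply pastes it into their AI of choice and the activity runs itself; the teacher stays in the lead without having to craft the prompts themselves.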
Since none of these approaches is a magic bullet, it would be wise to combine several of them… until, that is, we see a genuine paradigm shift.
Our current education system is based on the 150-year-old idea of classrooms and end-of-year assessments. What if qualifications were awarded like driving tests, or music performance grades, booked on a day that’s appropriate for the student? Rather than all students taking their GCSEs in the summer aged 16, why not let students achieve their ‘Grade 7’ Maths qualification on one day and their ‘Grade 4’ English Literature on another, irrespective of their age and the season? If AI were harnessed to deliver bespoke assessments, with human moderated auto-marking, this could become a reality.
Exams are principally a test of memorisation and recall. Even complex problem-solving tasks can typically be defeated by studying enough past paper questions. Post-education, how often are people required to undertake such problem solving without access to a wealth of online tools? Why must assessments be based on such a restrictive paradigm? Could we focus on skills and character traits, rather than memory – wouldn’t that be far better preparation for real careers? How to compare old and new qualifications would require careful thought, but there are numerous examples already of existing qualifications not mapping to hiring decisions.
Clearly, a huge investment would be needed in hardware, software, and, crucially, staff retraining. However, it is exceedingly likely that this is exactly the transformation most businesses and enterprises are either about to undertake or well on their way to completing already. If education’s purpose is to equip people for a successful life, should it not model likely life experiences?
Until such a paradigm shift occurs, though, thoughtful use of emerging tools has significant potential to raise attainment today, independent of future considerations.