AI & Back to School: The terms you agreed to over summer
Every August, the ritual begins. Credentials get dusted off, Wi-Fi passwords get reset, and someone discovers that the system update scheduled for July somehow didn’t happen. Familiar, but manageable chaos.
This year, though, there's a new item on the list. And it's a bit harder to fix with a quick restart.
AI is no longer optional
Whether your school has an AI strategy or not, the tools you’re already using are quietly getting AI features baked in. New toggles appear in dashboards. Features get announced in product update emails that nobody reads until something goes wrong. The question is no longer whether AI will be part of the school day. It was already decided for you.
From “what should we try?” to “wait, what did we agree to?”
For years, edtech decisions were driven by enthusiasm more than evidence. New platform, new pilot, new promise. Schools were encouraged to experiment, move fast, and figure out the details later. A lot of details got left behind.
Now, school leaders across Europe are asking harder questions about the tools they've accumulated: what's actually working, what's collecting more than it should, and what exactly was agreed to when someone clicked "accept updated terms."
That’s a genuinely good development. It’s also a bit overdue.
What to actually do before the school year starts
Start with a platform audit. Which tools has your school been using, and which of them have quietly introduced AI features in the past year? The answer is probably more than you’d expect. The follow-up question is whether those features came with updated data processing agreements, because under GDPR, schools bear legal responsibility for what happens to student data once it leaves their hands.
This isn't overcaution. It's just good governance. European schools are actually well equipped for this moment. GDPR gives schools real leverage to demand clarity from vendors, and the EU AI Act is adding another layer of transparency and accountability requirements for AI systems used in sensitive contexts like education. The frameworks exist. The trick is treating them as tools rather than paperwork, and asking vendors the right questions before the contract is signed, rather than after something goes wrong.
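For schools that want to make the audit concrete, the exercise above boils down to a simple filter over an inventory: list every tool, note whether it has gained AI features, and note whether the data processing agreement has been updated to cover them. A minimal sketch in Python (the `Tool` fields and the example inventory are hypothetical, not from any real school system):

```python
from dataclasses import dataclass

@dataclass
class Tool:
    name: str
    has_ai_features: bool   # did the vendor add AI in the past year?
    dpa_updated: bool       # does the data processing agreement cover those features?

def audit(tools: list[Tool]) -> list[str]:
    """Return the names of tools with AI features not covered by an updated DPA."""
    return [t.name for t in tools if t.has_ai_features and not t.dpa_updated]

# Hypothetical inventory for illustration only
inventory = [
    Tool("Learning platform", has_ai_features=True, dpa_updated=True),
    Tool("Quiz app", has_ai_features=True, dpa_updated=False),
    Tool("Shared calendar", has_ai_features=False, dpa_updated=True),
]

print(audit(inventory))  # ['Quiz app']
```

In practice this would be a spreadsheet rather than a script, but the logic is the same: the tools that need a vendor conversation are exactly those where new AI features and an outdated agreement coincide.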
The tools you don’t know about
Here’s the part schools rarely want to talk about: the AI use that’s already happening, just not through officially approved channels.
Teachers paste lesson plans into ChatGPT. Students use AI writing assistants that the school has never heard of. A well-meaning administrator runs student performance data through a free AI tool to save time on a report. None of this shows up in your platform audit, because none of it was sanctioned in the first place.
This is shadow use, and it’s widespread. It’s also not primarily a discipline problem. It’s a gap problem. When schools don’t provide tools that help, people find their own. When the official platforms feel clunky or slow or disconnected from what teachers actually need to do, workarounds happen. The response to shadow use isn’t a stricter policy. It’s asking honestly why people felt they needed to go elsewhere, and whether the answer to that question is something you can fix.
That means getting ahead of it before the school year starts. A clear, realistic AI use policy matters, one that acknowledges AI exists and gives staff and students a sensible framework, rather than pretending a ban will hold. More importantly, it means making sure the approved tools are good enough that people want to use them.
Talking to your teachers before talking to your tools
Many teachers are genuinely excited about what AI can do for their workload. Many are also exhausted by the pace of change and quietly relieved when someone asks how they actually feel about it. Back to school is the right moment for that conversation.
It's also a good moment to point them toward AI built with teachers as the starting point, not the afterthought. The question worth asking is whether an AI tool is designed around the tasks that actually steal teaching time (planning, structuring activities, preparing materials, adapting content for students who need a different approach) or whether it's been bolted on as a feature. Differentiation is one of those things every teacher knows matters and almost nobody has enough hours to do properly; this is exactly where well-designed AI should be pulling its weight. And critically, schools should be asking where their student data actually lives and who controls it. The goal is not AI that replaces teacher judgment, but AI that clears enough space for teachers to exercise it. The kind of tool that, ideally, means they don't need to go looking elsewhere.
And yes, also: check that your integrations work, your access permissions are up to date, and any shiny new features get the training and onboarding they deserve. The boring stuff still matters.
The real question
Underneath all the practical preparation is one question worth sitting with: whose interests does your school’s technology actually serve?
The platforms with the biggest market share were mostly built by companies whose business model has very little to do with education. That doesn’t automatically make them bad choices, but it does mean schools shouldn’t assume alignment where none was designed. When the product is free, or heavily discounted, or bundled into something else, it’s worth asking what’s being exchanged.
It’s also worth asking where your data actually lives. Vendors hosting data outside the EU operate under very different rules, rules that can require them to hand over data to third parties in ways that would simply not be permitted here. It’s a question that belongs in every procurement conversation.
And there’s a meaningful difference between platforms built in Europe for European schools, and platforms built for a global market that have since been adapted to meet European requirements. One starts from the same regulatory reality you operate in every day. The other is catching up.
Students deserve platforms built around learning. Not platforms that treat learning as a data source with a login screen in front of it.
At itslearning, our approach to AI starts with a simple principle: it should serve teachers, not the other way around. That means AI that's built into the learning platform rather than layered on top, with student data kept firmly within the environment schools already trust. Later this year, we're launching the AI Teacher Toolkit: features designed specifically around the work that eats into teaching time, from planning and structuring activities to differentiating content for students who need a different approach. The right tools don't just save time; they change how learning happens. So what could you do with a few hours back each week?
To find out more about itslearning, visit https://itslearning.com/

About the Author – Stina Boge is Head of Marketing & Communication at itslearning, where she works at the intersection of technology, communication, and public affairs. With over 17 years of marcom experience across multiple industries, she joined itslearning in 2022 — initially focusing on product communication before expanding into broader marketing and strategic work. She takes a particular interest in the policy and strategic questions shaping digital technology in schools, and what those choices mean for educators, students, and the platforms built to serve them.