AI in edtech has officially moved past the gimmick phase.
But here’s the problem I keep seeing as a Product Manager and AI adoption consultant: most edtech products are still treating AI as a feature, not a foundation.
“AI-powered quiz generation.”
“Smart marking.”
“Lesson suggestions.”
Useful? Yes. Transformational? Not even close.
The real value of AI doesn’t sit in what users click; it sits in how your product thinks, acts, learns, and earns trust over time.
That’s why edtech teams need to start designing around an AI maturity stack, not a backlog of AI features.
From AI Features to AI Systems
If you strip away the marketing, most “AI-enabled” edtech tools today operate at a mid-level maturity:
- They respond to prompts
- They recommend actions
- They generate content
But the products schools will adopt, trust, and renew are moving towards something else entirely: agentic systems that work on behalf of staff. That shift only happens when you think in layers.
The AI Stack (EdTech Edition)
Here’s how I explain the AI maturity stack in plain English — with real edtech examples.
1) Infrastructure
Cloud, compute, APIs — table stakes.
In edtech, this also means UK data residency, safeguarding, uptime, and compliance. If this layer isn’t solid, nothing above it matters.
2) Agent Internet
This is where AI stops waiting for prompts.
Example: an AI that monitors attendance trends, behaviour logs, and safeguarding signals and proactively flags risk, instead of waiting for a teacher to notice.
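To make that concrete, here’s a minimal sketch of that proactive loop. Everything here is illustrative — the signal names, weights, and threshold are assumptions, not a real risk model:

```python
# Sketch of a proactive risk-flagging agent (all names and weights hypothetical).
# Instead of waiting for a prompt, it scans signals on a schedule and surfaces
# pupils whose combined risk crosses a threshold.

from dataclasses import dataclass

@dataclass
class PupilSignals:
    name: str
    absence_rate: float       # 0.0-1.0 over the last term
    behaviour_incidents: int  # logged incidents this term
    safeguarding_flags: int   # open safeguarding notes

def risk_score(s: PupilSignals) -> float:
    # Illustrative weighting only; a real model would be tuned and validated.
    return 0.5 * s.absence_rate + 0.1 * s.behaviour_incidents + 0.4 * min(s.safeguarding_flags, 1)

def flag_at_risk(pupils: list[PupilSignals], threshold: float = 0.3) -> list[str]:
    """Run periodically (e.g. nightly) — no teacher prompt required."""
    return [p.name for p in pupils if risk_score(p) >= threshold]

pupils = [
    PupilSignals("A", absence_rate=0.05, behaviour_incidents=0, safeguarding_flags=0),
    PupilSignals("B", absence_rate=0.30, behaviour_incidents=3, safeguarding_flags=1),
]
print(flag_at_risk(pupils))  # ['B'] — B crosses the threshold; A does not
```

The point isn’t the scoring maths; it’s that the agent runs on a schedule and flags risk before anyone asks.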
3) Protocols
Interoperability isn’t optional anymore.
If your AI can’t talk to the MIS, assessment tools, behaviour systems, and comms platforms, it’s just another silo, no matter how “smart” it sounds.
4) Tools
Can your AI do things, not just suggest them?
Example: automatically updating records, drafting parent comms, scheduling meetings, or logging interventions, with human approval baked in.
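Structurally, “do things with human approval baked in” can be as simple as a queue that separates proposing an action from executing it. This is a sketch with made-up names, not a real integration:

```python
# Sketch of "AI can act, with human approval baked in" (hypothetical API).
# The agent drafts actions; nothing executes until a member of staff approves.

from dataclasses import dataclass, field

@dataclass
class ProposedAction:
    kind: str      # e.g. "draft_parent_email", "log_intervention"
    payload: dict
    approved: bool = False

@dataclass
class ActionQueue:
    pending: list[ProposedAction] = field(default_factory=list)
    executed: list[ProposedAction] = field(default_factory=list)

    def propose(self, action: ProposedAction) -> None:
        self.pending.append(action)

    def approve_and_run(self, action: ProposedAction) -> None:
        action.approved = True
        self.pending.remove(action)
        # In a real product this would call the MIS / comms integration.
        self.executed.append(action)

queue = ActionQueue()
draft = ProposedAction("draft_parent_email", {"pupil": "B", "topic": "attendance"})
queue.propose(draft)
assert not queue.executed  # nothing happens without approval
queue.approve_and_run(draft)
print([a.kind for a in queue.executed])  # ['draft_parent_email']
```

The design choice that matters: the approval gate is structural, not a checkbox bolted on afterwards.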
5) Cognition
This is decision support, not automation theatre.
Example: prioritising which pupils need intervention this week, not just listing “at-risk” students.
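The difference between listing and prioritising is a capacity constraint. A toy version (scores and capacity invented for illustration):

```python
# Sketch of decision support: not "here is every at-risk pupil" but
# "given this week's capacity, intervene with these pupils first".

def prioritise(scores: dict[str, float], weekly_capacity: int) -> list[str]:
    """Rank pupils by risk score, return only as many as staff can act on."""
    ranked = sorted(scores, key=scores.get, reverse=True)
    return ranked[:weekly_capacity]

scores = {"A": 0.12, "B": 0.85, "C": 0.47, "D": 0.40}
print(prioritise(scores, weekly_capacity=2))  # ['B', 'C']
```

An at-risk list with forty names is automation theatre; a ranked shortlist sized to this week’s staff capacity is decision support.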
6) Memory
Now it gets powerful and personal.
The AI learns what works:
- Which interventions stick
- Which pupils struggle at certain times
- How each school operates differently
No more “one-size-fits-all AI.”
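At its simplest, the memory layer is just outcome tracking keyed by school, so the same intervention can earn a different track record in different settings. A deliberately tiny sketch (school and intervention names invented):

```python
# Sketch of the memory layer: record intervention outcomes per school,
# so recommendations adapt to what actually works there.

from collections import defaultdict

# (school, intervention) -> [successes, attempts]
outcomes: dict[tuple[str, str], list[int]] = defaultdict(lambda: [0, 0])

def record(school: str, intervention: str, worked: bool) -> None:
    stats = outcomes[(school, intervention)]
    stats[0] += int(worked)
    stats[1] += 1

def success_rate(school: str, intervention: str) -> float:
    wins, tries = outcomes[(school, intervention)]
    return wins / tries if tries else 0.0

record("Oakfield", "morning_checkin", True)
record("Oakfield", "morning_checkin", True)
record("Oakfield", "parent_call", False)
print(success_rate("Oakfield", "morning_checkin"))  # 1.0
```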
7) Applications
This is all the school actually sees.
A simple dashboard that says:
- “Here’s what needs attention today”
- “Here’s what I’ve already prepared”
- “Here’s what I recommend next and why”
8) Governance
The most overlooked layer, and the one schools care about most.
Audit trails. Escalation paths. Human-in-the-loop controls. Safeguarding by design.
No governance = no trust.
No trust = no renewal.
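In code, governance by design means the audit trail and the escalation path are part of the decision function itself, not a reporting feature added later. A sketch with hypothetical names:

```python
# Sketch of governance by design (all names hypothetical): every AI decision
# is written to an audit trail with its inputs and reasoning, and higher-risk
# calls escalate to the Designated Safeguarding Lead rather than auto-acting.

from datetime import datetime, timezone

audit_log: list[dict] = []

def decide(pupil: str, score: float, threshold: float = 0.6) -> str:
    """Route a risk score: escalate to a human above threshold, else just log."""
    outcome = "escalate_to_dsl" if score >= threshold else "log_only"
    audit_log.append({
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "pupil": pupil,
        "score": score,
        "outcome": outcome,
        "reason": f"score {score:.2f} vs threshold {threshold:.2f}",
    })
    return outcome

print(decide("B", 0.85))  # escalate_to_dsl, with a full audit entry
print(decide("C", 0.47))  # log_only
```

If a school asks “why did the system flag this pupil?”, the answer is already in the log.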
Why This Matters for Product Leaders
Most AI roadmaps I see in edtech are shallow:
- Short-term feature wins
- Demo-friendly outputs
- Little thought to long-term system behaviour
That’s risky, because:
- Schools will expect smarter, connected systems
- AI is changing how software is used, not just what it can generate
- The winners won’t “add AI”; they’ll be built on it
If a competitor builds AI that genuinely reduces staff cognitive load, integrates cleanly with existing systems, and earns trust…
Feature-led products will lose, and fast.
What to Ask Yourself Now
If AI is on your roadmap, ask:
- Which layers of the AI stack are we actually operating in?
- What’s our plan to move up the stack over time?
- Are product, engineering, and governance aligned?
- Are we building something schools can trust, not just trial?
Final Thought
Edtech is shifting from “tools schools use” to “systems that work on behalf of schools.”
That shift doesn’t come from better features; it comes from better layers.
If this resonates, I’d love to hear how others are thinking about AI maturity in their products.