I’ve been reporting on tech and education for years, and lately more parents have been asking me the same question: what should I know before my child’s school starts using AI tutoring apps from third‑party providers? These tools promise personalised help, instant feedback and more time for teachers. But they also raise real questions about data privacy, equity, quality and oversight. Here’s a practical guide, drawn from my experience watching how schools adopt new tech, to help you ask the right questions and feel confident about how AI is used in your child’s learning.
What exactly do schools mean by “AI tutoring”?
Schools use the term in different ways. Sometimes it’s a simple homework‑help chatbot (think of ChatGPT, Google Gemini or Microsoft Copilot) embedded in an app. Other times it’s a more structured adaptive learning platform, such as Century Tech, Squirrel Learning or other third‑party platforms that personalise lesson sequences based on each pupil’s performance. The common thread is that these systems use algorithms to adapt content or give feedback to students without a teacher intervening in every interaction.
Data protection and privacy: the first, non‑negotiable issue
The biggest practical concern for parents is what data the app collects and who can access it. Under the UK GDPR and the Data Protection Act 2018, schools are required to ensure that third‑party providers handle pupil data securely and lawfully. But in practice, contracts vary, and so do privacy features.
- Ask for the privacy notice and data processing agreement the school signed with the provider. It should say what personal data is collected, how long it’s stored, whether it’s used to train AI models, and whether data is shared with other companies or overseas.
- Find out whether parental consent is needed. For many routine educational uses, schools can rely on a lawful basis such as public task, but any marketing use, or profiling beyond educational purposes, usually requires explicit consent.
- Check for anonymisation. If the provider trains models on pupil interactions, does it anonymise or pseudonymise data first? If not, insist on clarity — and challenge it if needed.
- Ask about security certifications (e.g. ISO 27001) and where data is stored. Cloud servers in other jurisdictions can complicate legal protections.
Accuracy, reliability and risk of hallucinations
AI tutors can be brilliant at giving explanations or checking facts — but they also make confident mistakes, or “hallucinate”. That’s dangerous in a learning environment where a child might take incorrect guidance as authoritative.
- Ask about validation: How does the provider check the correctness of its answers? Are subject specialists involved in building the content?
- Does the app cite sources? Reliable systems should point to curriculum‑aligned resources, not just generate unreferenced text.
- Look for built‑in safeguards: Are risky or biased responses blocked? Is there a way for teachers or parents to flag poor outputs?
Equity and access
AI tools can widen or narrow gaps, depending on how they’re implemented. If only some pupils have devices at home, or if an app requires a paid upgrade to unlock key features, inequality grows.
- Ask the school about device provision: Will pupils who lack home devices get access? Are resources available offline?
- Check for cost barriers: Are families expected to buy subscriptions or in‑app purchases?
- Consider neurodiversity and language needs: Good platforms offer adjustable interfaces, text‑to‑speech and multilingual support. Ask whether the chosen tool has these features.
The teacher’s role — complementary, not replaced
One myth I frequently hear is that AI will replace teachers. In practice, effective roll‑outs keep teachers in the loop, using AI to free up time for human guidance rather than to outsource classroom control.
- Ask how teachers are trained: Are staff given professional development to use the app and interpret its analytics?
- Ask who monitors results: Will teachers review AI‑generated recommendations and decide on interventions?
Safeguarding and wellbeing
When children interact with third‑party apps, safeguarding matters. The provider should have policies for protecting pupils from inappropriate content and for reporting concerns.
- Does the app have content filters? Can it be configured to avoid inappropriate material and limit unsupervised conversation for younger pupils?
- Is there human moderation? Automated filters help, but human oversight is essential, especially where sensitive issues or mental‑health queries appear.
Transparency and explainability
I believe parents and pupils deserve clear explanations of how decisions are made. Vague claims about “proprietary AI” shouldn’t be a shield against scrutiny.
- Ask for a plain‑English explanation of how the algorithm personalises learning and how recommendations are generated.
- Request examples: Real‑world anonymised case studies showing how the tool helped (or didn’t help) pupils are useful.
Practical questions for parents to ask the school
- Which company provides the AI tool and what references can they give from other schools?
- What data does the provider collect and can parents see a sample data log?
- Who can access my child’s data (school staff, vendor, third parties)?
- Will the app’s outputs be used in assessments or to make high‑stakes decisions?
- Can parents opt their child out without penalty, and what alternatives are available?
- How are inaccuracies reported and corrected?
- What training is offered to staff and pupils on safe, effective use?
- How will the school evaluate the tool’s impact on learning and wellbeing?
Spotting red flags
Watch out for these warning signs when a provider or school is too eager to roll out a tool:
- Contracts that allow the vendor to use pupil data for “product improvement” without clear anonymisation.
- Ambiguous statements about who owns the data or how long it’s kept.
- Pressure on parents to buy premium features or hardware.
- Lack of teacher training, or claims that the app will replace lesson planning entirely.
Typical benefits and risks at a glance
| Potential benefit | Common risk |
|---|---|
| Personalised practice and faster feedback | Inaccurate answers presented confidently (hallucinations) |
| Data‑driven insights for teachers | Privacy lapses or overcollection of personal data |
| Extra support outside class hours | Unequal access if only some pupils have devices |
| Time saving for routine tasks | Overreliance leading to reduced critical thinking |
Finally, be ready to stay involved. Technology procurement and oversight in schools are not one‑and‑done conversations. If your child’s school is planning a pilot, ask to see the evaluation plan and join any parent meetings. If something seems unclear or worrying, raise it with the headteacher or governors, and if necessary with the local authority or the Information Commissioner’s Office. When schools bring third‑party AI into the classroom, informed parents are a crucial check on quality, equity and safety.