What checklist should parents use to vet edtech apps for pupil data privacy?

I’ve seen firsthand how quickly schools adopt apps that promise engagement, convenience and personalised learning. As a parent, that can feel reassuring — until you start wondering what happens to your child’s data. Over the past few years I’ve dug into privacy policies, asked vendors awkward questions, and spoken with teachers who are just trying to make technology work in busy classrooms. Below is the practical checklist I now use (and recommend) when vetting edtech apps for pupil data privacy.

Start with the basics: is the app appropriate for children?

Before you dig into the legalese, ask a simple question: is this product designed for use by children or schools? An app aimed at adults may not meet the same protections. Look for clear statements on the developer’s site that the product is intended for K‑12 or school use, and check minimum age requirements in app stores. If the app is targeted to adults but being pushed into classrooms, that’s a red flag.

Checklist: what to look for in the app and vendor

  • Data minimisation: Does the app only collect what it needs? For example, a reading practice app shouldn’t require home address or health information.
  • Purpose limitation: Is the purpose of data collection explicit (e.g. “to track progress in maths”) and limited to education functions?
  • Parental consent: For users under the relevant legal age (often 13 in practice, but check national rules), does the app require verifiable parental consent before creating accounts or collecting personal data?
  • Clear privacy policy: Is the policy readable and specific about what is collected, how long data is kept, and who it’s shared with? Avoid apps that give only vague or boilerplate statements.
  • Third‑party sharing: Does the vendor share data with ad networks or analytics firms? Apps that monetise via targeted advertising are not suitable for children’s data.
  • Data retention and deletion: Can parents or schools request deletion of a pupil’s data? Is there a stated retention period?
  • Security measures: Are data encrypted in transit (TLS) and at rest? Does the company mention access controls, regular security audits or ISO/SOC certifications? (A quick self-check for the "in transit" part is sketched after this list.)
  • Localisation and data residency: Where is the data stored? Some schools prefer UK/EU storage to avoid transatlantic transfer complexities (Schrems II and onwards).
  • Teacher controls and visibility: What data can teachers see versus what remains private? Is there segregation between pupil profiles and identifiable parental details?
  • Behavioural profiling: Does the app profile students for advertising, social scoring or predictive analytics without clear educational necessity?
  • Access logging and audit trails: Can the school see who accessed a pupil’s file and when? This helps detect inappropriate access.
  • Contractual protections: Does the school sign a Data Processing Agreement (DPA) or equivalent that binds the vendor to legal obligations under UK/EU law?
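
One item above, encryption in transit, is something you can partially sanity-check yourself. Below is a minimal sketch, assuming Python 3 and a placeholder hostname; it only confirms that the app's web endpoint negotiates a modern TLS version with a valid certificate, which says nothing about encryption at rest or how data is handled once it arrives.

```python
# Minimal sketch: confirm an edtech app's endpoint presents a valid certificate over
# a modern TLS version. The hostname is a placeholder for whichever app you are vetting.
import socket
import ssl

def check_tls(hostname: str, port: int = 443) -> None:
    context = ssl.create_default_context()  # verifies the certificate chain by default
    with socket.create_connection((hostname, port), timeout=10) as sock:
        with context.wrap_socket(sock, server_hostname=hostname) as tls:
            cert = tls.getpeercert()
            subject = dict(item for rdn in cert["subject"] for item in rdn)
            print(f"{hostname}: negotiated {tls.version()}")  # e.g. TLSv1.3
            print(f"  certificate issued to: {subject.get('commonName', 'unknown')}")
            print(f"  certificate valid until: {cert['notAfter']}")

if __name__ == "__main__":
    check_tls("example-edtech-app.com")  # placeholder domain, not a real vendor
```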

How I check policies without getting bogged down

Policies can be long and impenetrable. I use a few quick tactics to cut through the noise:

  • Search the privacy policy for keywords: “advertising”, “third party”, “retain”, “delete”, “encrypt”, “transfer”. If those words don’t appear, that’s suspicious. (A small script for this is sketched after this list.)
  • Look for a DPA or terms specifically aimed at schools — many reputable edtech firms publish a separate Schools or Education Data page.
  • Check recent news: companies that have sold data or been involved in privacy scandals usually have press coverage. Google Classroom, Seesaw and ClassDojo are widely used; each has faced scrutiny at various times, so read up on how they responded to concerns.
  • Ask the vendor directly: simple, specific questions often reveal whether they understand school requirements. For example, “Do you sign standard DPAs with UK schools?” or “Where is pupil data stored?”
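
For the keyword search, I sometimes paste the policy into a text file and let a short script do the scanning. A rough sketch, assuming Python 3; the filename is just an example, and a missing keyword is a prompt to ask questions rather than proof of wrongdoing:

```python
# Rough sketch: flag which privacy-relevant keywords appear in a saved policy text.
from pathlib import Path

KEYWORDS = ["advertising", "third party", "third-party",
            "retain", "delete", "encrypt", "transfer"]

def scan_policy(path: str) -> None:
    text = Path(path).read_text(encoding="utf-8").lower()
    for word in KEYWORDS:
        status = "found" if word in text else "MISSING"
        print(f"{word:>12}: {status}")

if __name__ == "__main__":
    scan_policy("privacy_policy.txt")  # save the vendor's policy text here first
```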

Questions to ask the school or the vendor

When a school adopts software, parents should feel empowered to ask — and expect clear answers. Here are the questions I suggest:

  • Who is the data controller? (Is it the school or the vendor?)
  • Is there a signed Data Processing Agreement? Can parents see a summary?
  • What categories of pupil data are collected (name, date of birth, test results, photos, behavioural notes)?
  • Are any third parties receiving pupil data, and for what purpose?
  • Where is the data hosted and how is it protected?
  • How long will the school keep pupil data and what is the deletion process?
  • Who can access pupil data within the vendor organisation?
  • Is parental consent required and how is it obtained?
  • Are there parental controls or opt‑out options for non‑essential uses?

Practical red flags and green flags

Red flags:

  • Policy allows data to be used for advertising: commercial targeting of children is unethical and often illegal in education settings.
  • No DPA, or a refusal to sign one: leaves schools without contractual protections and parents without legal recourse.
  • Data stored only in the US with unspecified safeguards: cross-border transfers may not meet UK/EU standards after Schrems II.
  • Excessive personal details requested at signup: violates the data minimisation principle.

Green flags:

  • An explicit statement that “we do not sell pupil data or show ads to pupils”: aligns the vendor’s incentives with pupil welfare.
  • Teacher controls and limited parent/guardian access: gives schools governance over classroom use.
  • Clear deletion tools and a stated data retention policy: ensures pupil profiles aren’t retained indefinitely.
  • Willingness to sign a DPA and demonstrate security practices: shows legal and operational readiness to handle school data.
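
When a school uses several apps, I find it easier to compare answers if I note them against these flags in one place. A minimal sketch, assuming Python 3.9+; the field names and the example app are purely illustrative, not a standard schema:

```python
# Illustrative only: a tiny record for tracking how one app measures up against the
# red/green flags above, so answers from different vendors can be compared side by side.
from dataclasses import dataclass, field

@dataclass
class AppVettingRecord:
    app_name: str
    uses_advertising: bool              # red flag if True
    dpa_signed: bool                    # green flag if True
    data_stored_in_uk_eu: bool          # relevant to transfer safeguards
    deletion_process_documented: bool   # green flag if True
    notes: list[str] = field(default_factory=list)

    def red_flags(self) -> list[str]:
        flags = []
        if self.uses_advertising:
            flags.append("policy allows advertising use of pupil data")
        if not self.dpa_signed:
            flags.append("no signed Data Processing Agreement")
        if not self.data_stored_in_uk_eu:
            flags.append("data stored outside UK/EU without stated safeguards")
        if not self.deletion_process_documented:
            flags.append("no documented retention/deletion process")
        return flags

record = AppVettingRecord(
    app_name="Example Reading App",   # hypothetical app
    uses_advertising=False,
    dpa_signed=True,
    data_stored_in_uk_eu=True,
    deletion_process_documented=False,
    notes=["Asked vendor about retention period; awaiting reply."],
)
print(record.red_flags())  # -> ['no documented retention/deletion process']
```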

What to do if you find something worrying

If an app used by your child’s school raises concerns, start by speaking to the teacher or headteacher. Schools that are serious about safeguarding will take this up. Ask for the DPA and for the school’s rationale for choosing the tool. If the response is unsatisfactory, you can escalate to the school governors or the local authority.

For legal or regulatory concerns, contact the Information Commissioner’s Office (ICO) in the UK. They publish guidance specifically aimed at schools and edtech vendors.

My personal approach when giving permission

I try to be pragmatic. Technology brings educational benefits I don’t want my children to miss, but I won’t accept opaque policies. In practice I:

  • Allow apps that explicitly rule out advertising and the sale of personal data, and that have clear DPAs.
  • Decline or ask for alternatives where the app asks for non‑essential details (photos, home address) without a clear need.
  • Ask for school assurances and written notes when a new app is introduced, and keep records of communications.

Edtech can be a force for good — helping teachers personalise learning, giving pupils creative outlets and streamlining administrative tasks. But privacy isn’t a luxury: it’s part of safeguarding children. Use this checklist, ask the hard questions, and demand clear answers. If enough parents do, vendors and schools will take data protection seriously.
