Why are UK councils quietly using AI vendors to decide welfare eligibility — and can residents challenge the algorithms?

I first started noticing a pattern while reporting on local government budgets: councils quietly buying software labelled as “decision support” or “eligibility automation” and then using it to decide who gets social care, housing support or back‑dated benefits. The technology is sold as a way to save money and speed up assessments, but for residents it can mean being reduced to a score or a flag inside a database — often without knowing how that decision was reached, or whether they can challenge it.

Why councils turn to AI vendors

Councils face relentless pressure to cut costs while managing rising demand for services. That’s fertile ground for private vendors pitching automated systems. The selling points are persuasive: X% fewer erroneous claims, Y% faster processing, and the promise of consistent, “objective” decisions. Companies such as Capita, Civica and various data analytics firms have been visible in local government procurement, alongside newer start‑ups offering machine learning models.

Local authorities buy these systems for three main reasons:

  • Budgetary savings: automation is presented as a way to reduce staffing costs and error rates.
  • Efficiency: AI can sift through large datasets, flagging cases for human review or producing recommendations.
  • Risk management: models can supposedly identify fraud or high‑risk cases earlier than manual reviews.
But the reality is more nuanced. Many councils are using these tools not merely to assist staff but to influence eligibility decisions — sometimes by prioritising which cases get a full assessment, or by generating a denial recommendation that a human worker must approve. That subtle shift changes how power is exercised and how accountable it is.

What’s worrying about these systems

There are three broad concerns I keep coming back to when I look at council AI deployments.

  • Opacity: Residents usually aren’t told that an algorithm has shaped a decision about their welfare. Contracts with vendors can be commercially sensitive, so councils sometimes withhold details about how models were trained or how thresholds are set.
  • Bias and injustice: Algorithms reflect the data they were trained on. If historical decisions were influenced by socioeconomic bias, the model can reproduce that bias at scale — for example, disadvantaging disabled people or those in particular neighbourhoods (the sketch below illustrates the mechanism).
  • Reduced human oversight: Automation can encourage “rubber stamping.” If a model says “not eligible,” busy caseworkers may be less likely to investigate deeper or exercise discretion.
Those risks are not hypothetical. A model that underestimates care needs could leave people without essential support. An algorithm that classifies certain benefit claims as “high risk” may trigger intrusive investigations or fraud referrals without a clear path for appeal.
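
To make the bias mechanism concrete, here is a minimal, purely illustrative Python sketch. Every name and number in it is invented: it imagines a scoring tool “trained” on historical decisions in which claims from one postcode area were under‑approved, and shows how a new claimant inherits that disadvantage.

```python
# Minimal, purely illustrative sketch: all data and numbers are invented.
from collections import defaultdict

# Hypothetical historical decisions: (postcode_area, approved)
# Suppose caseworkers historically under-approved claims from area "B".
history = [
    ("A", True), ("A", True), ("A", False),
    ("B", False), ("B", True), ("B", False), ("B", False),
]

# "Training": learn each area's historical approval rate from past cases.
counts = defaultdict(lambda: [0, 0])          # area -> [approved, total]
for area, approved in history:
    counts[area][0] += int(approved)
    counts[area][1] += 1

def approval_score(area: str) -> float:
    """Score a new claim by its area's historical approval rate."""
    approved, total = counts[area]
    return approved / total

# Two new claimants with identical circumstances get different scores
# purely because of where they live: the old bias is now automated.
print(f"Area A claimant score: {approval_score('A'):.2f}")  # 0.67
print(f"Area B claimant score: {approval_score('B'):.2f}")  # 0.25
```

A real vendor model is far more complex than this, but the feedback loop is the same: yesterday’s skewed decisions become today’s “objective” score.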

Can residents challenge algorithmic decisions?

Yes — but it’s often an uphill battle. Here are practical routes residents can pursue if they suspect an automated system has affected their case.

  • Request information directly from the council: Ask whether an automated tool was used, and request the decision rationale. Councils must provide reasons for adverse decisions under administrative law.
  • Subject Access Request (SAR): Under data protection law, you can request your personal data. That can include model inputs and any outputs stored about you. Use the Data Protection Act 2018 and the UK GDPR as the legal basis.
  • Freedom of Information (FOI) request: FOI can reveal the existence of contracts, procurement documents, and algorithmic impact assessments (if created). Some technical details may be withheld for commercial reasons, but FOI can still be revealing.
  • Complain to the ICO: The Information Commissioner’s Office handles data protection complaints. You can complain if you believe your data has been misused or a model breached data‑protection principles.
  • Judicial review: If a decision is unlawful — for example, if the council failed to consider statutory duties or relied on a flawed process — residents can seek judicial review. This is costly and time‑sensitive but has successfully overturned unfair local decisions in the past.
  • Equality Act claim: If an automated decision discriminates against a protected characteristic (disability, age, race, etc.), there may be grounds for an equality or human‑rights challenge.
  • Political and media pressure: Raising the issue with councillors, MPs or local journalists often produces results. Transparency and reputational risk can push councils to pause or revise AI use.
How to make a challenge practical — step by step

From the cases I’ve followed, residents who win challenges tend to combine legal steps with public pressure and meticulous record‑keeping. Here’s a practical checklist I’ve seen work:

  • Keep every letter, email and decision notice related to your case.
  • Submit a Subject Access Request asking for “any automated decision outputs, scores, or notes generated by an algorithm regarding my case.”
  • File an FOI request for the vendor contract and any equality or algorithmic impact assessments.
  • Escalate internally through the council’s complaints process while simultaneously contacting your local councillor and MP.
  • If necessary, consult a solicitor specialising in public law or discrimination to assess judicial review or tribunal options.
  • Notify the ICO and keep a record of their response.
Route                  | Typical timeframe           | What you can expect
Subject Access Request | One month (can be extended) | Access to personal data, possible model outputs
FOI request            | 20 working days             | Contracts, procurement, some technical docs (redactions possible)
ICO complaint          | Varies (weeks–months)       | Investigation into data handling, potential enforcement
Judicial review        | Months to years             | Challenging lawfulness of process or decision
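
If it helps with the record‑keeping step above, here is a small, illustrative Python sketch for tracking those response deadlines. It deliberately ignores UK bank holidays and simplifies month arithmetic, so treat the output as a rough guide, not legal advice.

```python
# Illustrative deadline tracker; ignores UK bank holidays and edge cases.
from datetime import date, timedelta

def foi_deadline(submitted: date, working_days: int = 20) -> date:
    """Count forward 20 working days (weekends skipped, holidays ignored)."""
    d, remaining = submitted, working_days
    while remaining > 0:
        d += timedelta(days=1)
        if d.weekday() < 5:          # Monday=0 ... Friday=4
            remaining -= 1
    return d

def sar_deadline(submitted: date) -> date:
    """One calendar month later; crude day clamp keeps the date valid."""
    year = submitted.year + (submitted.month // 12)
    month = submitted.month % 12 + 1
    return date(year, month, min(submitted.day, 28))

print("FOI response due by:", foi_deadline(date(2024, 3, 1)))
print("SAR response due by:", sar_deadline(date(2024, 3, 1)))
```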

What transparency looks like — and what I try to ask councils

When I press councils for accountability, I ask a few specific questions that I think every resident should be able to ask too:

  • Was an automated system used in my (or this type of) decision?
  • Who supplied the system and what data was used to train it?
  • Was an equality impact assessment or algorithmic impact assessment completed and published?
  • What human oversight is in place — who signs off final decisions?
  • How is accuracy measured, and how often is the model audited or retrained?
Good practice would be publishing an algorithmic register, as some organisations do, showing where AI is used, for what purpose and how residents can appeal (a hypothetical entry is sketched below). The ICO has issued guidance on AI and data protection; local authorities should follow it, but many do not yet do so consistently.
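
For illustration only, one entry in such a register might look something like this sketch; every field name and value here is an assumption, not a published standard or a real system.

```python
# Hypothetical sketch of one entry in a council's algorithmic register.
# Every field name and value is invented for illustration.
register_entry = {
    "system_name": "Housing support eligibility triage",   # invented
    "vendor": "Example Analytics Ltd",                     # invented
    "purpose": "Prioritise applications for full human assessment",
    "decision_role": "advisory",            # advisory, not determinative
    "data_sources": ["benefits history", "housing records"],
    "human_oversight": "Senior caseworker signs off every refusal",
    "impact_assessments": {
        "equality_impact_assessment": "completed and published",
        "data_protection_impact_assessment": "completed and published",
    },
    "appeal_route": "Council complaints process; SAR to the DPO",
    "audit": "Annual accuracy and bias review, results published",
}

for field, value in register_entry.items():
    print(f"{field}: {value}")
```

Even a simple structure like this would let residents see, at a glance, whether a system touched their case and where to direct a challenge.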

The bigger picture

This is not just a technical issue — it’s about how we want state power exercised. Automated systems can help under pressure, but without transparency, accountability and strong safeguards they risk repeating and amplifying social injustice. Residents need practical routes to challenge decisions, and councils must be forced into the light to justify using algorithms for welfare decisions that affect people’s lives.

If you think an algorithm shaped a decision about your benefits or support, start by asking the council directly and consider a Subject Access Request and FOI. Get local representatives involved, and if the case appears unlawful or discriminatory, seek legal advice. It’s a slow, sometimes frustrating process — but the alternative is accepting that crucial social choices are made by invisible code with little room for human oversight.
