SOCIAL CARE

We need to talk about machine learning

Michael Sanders believes around 10 local authorities are already using systems and tools that include machine learning to identify risky children’s social care cases. Here, he expresses his concerns.

For most people, depending on their age and preferences, talk of machine learning and artificial intelligence conjures up either the Stanley Kubrick classic 2001: A Space Odyssey, one of the various entries in the Terminator franchise, or I, Robot.

Machine learning exploits the huge power of modern computers to run millions of calculations quickly and to uncover patterns in data that no human-led analysis could ever find. The models it produces can have high levels of accuracy. But because of their complexity – they can contain hundreds of variables, often interacting with each other in complicated ways – they can be a ‘black box’, so we cannot easily identify the source of their predictions. There is also an ever-present risk of bias in the data, or in the way it is propagated by the model, which can have serious negative consequences.

Alongside these concerns, there are many in governments, both local and national, who see the potential of predictive analytics to improve the accuracy of risk assessment by considering complex data quickly.

A previous study conducted by the Behavioural Insights Team, which looked at the risks of escalation for children's social care cases that had been assessed but where no further action had been taken, found that machine learning performed better than traditional statistical techniques, so there is some basis for this belief.

My first concern is about transparency. The lack of openness about how these tools are being used makes it hard for the children and families potentially affected by them (as well as society at large) to know what is happening or why.

My second concern is about effectiveness. Machine learning uses a lot of computing power to try to make its predictions, and can potentially find patterns that both humans and simpler forms of statistical analysis can miss.

However, there are details that people like social workers will pick up which a machine won't see – things like tone, inflection, or context that are hard to capture even in the richest dataset. Similarly, in a lot of cases, more basic analysis will perform just as well.

Without transparency about how these tools are being used, how well they work, and how biased they are, the potential for harm – or at least, a lot of wasted money – is enormous. That is why at the What Works Centre for Children's Social Care we are hoping to work with local authorities over the next year to understand the ethics and the effectiveness of these techniques and to share what we find publicly. We urgently need a debate on this, and we are keen to be part of the conversation.

Michael Sanders is the executive director of the What Works Centre for Children's Social Care. He was previously chief scientist at the Behavioural Insights Team
