SOCIAL CARE

We need to talk about machine learning

Michael Sanders believes around 10 local authorities are already using systems and tools that include machine learning to identify risky children’s social care cases. Here, he expresses his concerns.

For most people, talk of machine learning and artificial intelligence conjures up, depending on their age and preferences, either the Stanley Kubrick classic 2001: A Space Odyssey, one of the various entries in the Terminator franchise, or I, Robot.

Machine learning exploits the huge power of modern computers to run millions of calculations quickly and to uncover patterns in data that no human-led analysis could ever find. The models it produces can be highly accurate. But because of their complexity (they can contain hundreds of variables, often interacting with each other in complicated ways) they can be a 'black box': we cannot easily identify the source of their predictions. There is also an ever-present risk of bias in the data, or in the way the model propagates it, which can have serious negative consequences.
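To make the 'black box' point concrete, here is a minimal sketch in Python. It uses scikit-learn and entirely synthetic data; it is an illustration of the general technique, not any council's actual system or data.

```python
# A minimal sketch of why complex models act as a 'black box'.
# All data here is synthetic and purely illustrative.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier

# Synthetic dataset: 5,000 cases, 200 variables, a binary outcome label.
X, y = make_classification(n_samples=5000, n_features=200,
                           n_informative=40, random_state=0)

model = GradientBoostingClassifier(random_state=0).fit(X, y)

# The model emits a probability for each case...
print(model.predict_proba(X[:1]))

# ...but that probability is the combined output of many small decision
# trees, each splitting on combinations of variables, so no single rule
# explains why a given case was flagged.
print(len(model.estimators_), "trees in the ensemble")
```

Even for the analyst who trained it, tracing an individual prediction back through a hundred interacting trees is far harder than reading off the coefficients of a simple statistical model.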

Alongside these concerns, there are many in governments, both local and national, who see the potential of predictive analytics to improve the accuracy of risk assessment by considering complex data quickly.

There is some basis for this belief. A previous study by the Behavioural Insights Team, which looked at the risk of escalation for children's social care cases that had been assessed but where no further action had been taken, found that machine learning performed better than traditional statistical techniques.
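For readers who want a sense of how such a comparison is run, the sketch below benchmarks a machine learning model against a traditional statistical baseline. It is a hypothetical illustration on synthetic data, not the Behavioural Insights Team's code or results.

```python
# Illustrative benchmark: machine learning vs a traditional statistical
# baseline, on synthetic data (not the BIT study's data or method).
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=5000, n_features=50,
                           n_informative=10, random_state=0)

models = [("logistic regression", LogisticRegression(max_iter=1000)),
          ("gradient boosting", GradientBoostingClassifier(random_state=0))]

for name, model in models:
    # Cross-validated AUC: how well each model ranks high-risk cases.
    auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc").mean()
    print(f"{name}: mean AUC = {auc:.3f}")
```

The point of such a comparison is the gap between the two scores: if the complex model barely beats the simple one, the extra opacity may not be worth it.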

So why am I worried? First, because the lack of transparency about how these tools are being used makes it hard for the children and families potentially affected by them, as well as society at large, to know what is happening or why.

My second concern is about effectiveness. Machine learning uses a lot of computing power to try to make its predictions, and can potentially find patterns that both humans and simpler forms of statistical analysis can miss.

However, there are details that people like social workers will pick up which a machine won't see – things like tone, inflection, or context that are hard to capture even in the richest dataset. Similarly, in a lot of cases, more basic analysis will perform just as well.

Without transparency about how these tools are being used, how well they work, and how biased they are, the potential for harm – or at least, a lot of wasted money – is enormous. That is why at the What Works Centre for Children's Social Care we are hoping to work with local authorities over the next year to understand the ethics and the effectiveness of these techniques and to share what we find publicly. We urgently need a debate on this, and we are keen to be part of the conversation.

Michael Sanders is the executive director of the What Works Centre for Children's Social Care. He was previously chief scientist at the Behavioural Insights Team.
