SOCIAL CARE

We need to talk about machine learning

Michael Sanders believes around 10 local authorities are already using systems and tools that include machine learning to identify risky children’s social care cases. Here, he expresses his concerns.

For most people, talk of machine learning and artificial intelligence conjures up, depending on their age and preferences, either the Stanley Kubrick classic 2001: A Space Odyssey, one of the various entries in the Terminator franchise, or I, Robot.

Machine learning exploits the huge power of modern computers to run millions of calculations quickly and to uncover patterns in data that no human-led analysis could ever find. The models it produces can have high levels of accuracy. But because of their complexity – they can contain hundreds of variables, often interacting with each other in complicated ways – they can be a 'black box', so we cannot easily identify the source of their predictions. There is also an ever-present risk of bias in the data, or in the way the model propagates it, which can have serious negative consequences.
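
To make the bias risk concrete, here is a minimal sketch – entirely synthetic data, with an invented 'group' variable standing in for a protected characteristic, not anything drawn from a real case management system – of one simple check: comparing false-positive rates between groups after a model is trained on historically skewed labels.

```python
# Illustrative only: synthetic data, not a real social care dataset.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5000

# Hypothetical features; 'group' stands in for a protected characteristic.
group = rng.integers(0, 2, n)
x = rng.normal(size=(n, 5))

# Underlying risk depends only on the features, but the recorded labels
# are skewed: cases from group 1 were flagged more often for the same
# level of risk, mimicking historical bias in the data.
risk = x[:, 0] + 0.5 * x[:, 1]
label = (risk + 0.8 * group + rng.normal(scale=1.0, size=n)) > 1.0

features = np.column_stack([x, group])
X_tr, X_te, y_tr, y_te, g_tr, g_te = train_test_split(
    features, label, group, random_state=0)

model = GradientBoostingClassifier().fit(X_tr, y_tr)
pred = model.predict(X_te)

# False-positive rate per group: how often non-risky cases get flagged.
for g in (0, 1):
    mask = (g_te == g) & ~y_te
    print(f"group {g}: false-positive rate = {pred[mask].mean():.2f}")
```

Run on data like this, the model reproduces the skew in the labels: cases from the disadvantaged group are wrongly flagged more often. The check itself is trivial; the point is that nobody can run it without access to the model and its data.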

Alongside these concerns, there are many in governments, both local and national, who see the potential of predictive analytics to improve the accuracy of risk assessment by considering complex data quickly.

A previous study by the Behavioural Insights Team – which looked at the risk of escalation in children's social care cases that had been assessed but where no further action was taken – found that machine learning performed better than traditional statistical techniques, so there is some basis for this belief.
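
The shape of that kind of comparison can be sketched in a few lines. What follows is a toy illustration on synthetic data – not the Behavioural Insights Team's method or dataset – scoring a machine-learning model against plain logistic regression on held-out cases, where the outcome depends partly on an interaction between features that a linear model cannot see.

```python
# Toy comparison, not the Behavioural Insights Team's actual analysis:
# synthetic data with an interaction effect that a linear model misses.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n = 4000
X = rng.normal(size=(n, 4))

# Escalation risk driven partly by an interaction between two features.
logit = X[:, 0] + 1.5 * X[:, 1] * X[:, 2]
y = rng.random(n) < 1 / (1 + np.exp(-logit))

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=1)

for name, model in [
    ("logistic regression", LogisticRegression(max_iter=1000)),
    ("random forest", RandomForestClassifier(random_state=1)),
]:
    model.fit(X_tr, y_tr)
    scores = model.predict_proba(X_te)[:, 1]
    print(f"{name}: AUC = {roc_auc_score(y_te, scores):.3f}")
```

On data with strong interactions like this, the machine-learning model pulls ahead; on simpler data the two scores tend to converge – the situation, discussed below, where more basic analysis performs just as well.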

I am concerned about this for two reasons. First, because the lack of transparency about how these tools are being used makes it hard for the children and families potentially affected by them (as well as society at large) to know what is happening or why.

My second concern is about effectiveness. Machine learning uses a lot of computing power to try to make its predictions, and can potentially find patterns that both humans and simpler forms of statistical analysis can miss.

However, there are details that people like social workers will pick up which a machine won't see – things like tone, inflection, or context that are hard to capture even in the richest dataset. Equally, in many cases, more basic analysis will perform just as well.

Without transparency about how these tools are being used, how well they work, and how biased they are, the potential for harm – or at least, a lot of wasted money – is enormous. That is why at the What Works Centre for Children's Social Care we are hoping to work with local authorities over the next year to understand the ethics and the effectiveness of these techniques and to share what we find publicly. We urgently need a debate on this, and we are keen to be part of the conversation.

Michael Sanders is the executive director of the What Works Centre for Children's Social Care. He was previously chief scientist at the Behavioural Insights Team
