In the 14 years since I started working in social welfare, technology has evolved at an astounding rate. We've gone from MP3 players to Spotify, Blackberries to smart watches, DVD rental to streaming. Not to mention the rapid acceleration in generative AI that we've seen over the last few years.
But for frontline social workers, stepping into work each morning is like going back in time. Most social workers I speak to are using the same technologies and tools that I was using 14 years ago. They are not seeing the benefits of these rapid developments in tech and AI. And it's holding them back.
Technologies like AI will of course not solve everything - but we mustn't let their limits blind us to their potential. AI can release frontline workers from administrative burdens, giving them more time to connect with people in need.
At Beam, the company I co-founded, our team of caseworkers reported that notetaking during meetings was driving an impersonal wedge between them and the people they support. They told us they wanted to be out supporting people, but were instead spending too much time behind a desk filling out forms.
We developed Magic Notes, which uses AI to transcribe meetings and automatically write up case notes, letters and referrals. External evaluation in 28 UK authorities found that the tool saves the average social worker eight hours a week in admin. Scale that impact up, and these time savings could translate to as much as £2bn annually.
Transcribing meetings is just the start. There is vast potential for AI tools to improve public services: streamlining case management, tracking progress, and flagging high-risk situations early so problems can be caught before they escalate. Beyond social work, AI tools can reduce administrative burdens for frontline workers across services like housing, employability and probation.
But councils and technologists can't do it alone. The new Government's first Budget is an opportunity to ensure AI is adopted at pace and safely, and in a way that preserves the compassion, empathy and judgement that are the hallmarks of social care.
To achieve this, we need more specialised procurement regimes operating in a nimbler, smarter way so that councils can pilot the responsible use of AI tools.
Government should also make it easier for local authorities to evaluate AI solutions. In the capital, the London Office for Technology and Innovation has helped build a culture of testing and experimentation. We need to see this approach adopted across the board and endorsed by ministers.
Finally, more could be done to empower councils to put their data to work for the benefit of service users. All too often, well-intentioned regulations prevent data from being used effectively in the service of residents. As the range and capabilities of AI tools and services develop, local authority leaders need to be empowered – without compromising personal information – to design and build services around local needs.
Our public services are overstretched and under-resourced. That's bad for service users, who face long waiting lists, short appointments and fragmented support. But it's also bad for frontline workers, who got into the job to make a difference but find themselves stuck behind desks wrangling with databases and clunky, decades-old systems.
I see a future where AI and human care co-exist, each enhancing the other. Together, we can modernise social care in a way that benefits everyone—frontline workers, those they support, and the taxpayers who fund these vital services.
Seb Barker is a former frontline support worker and the co-founder of Beam, a social impact company