The AI Adoption Gap: Practical steps for local authorities to turn hype into service improvement
- Yannick Mitchell
- Oct 3
- 4 min read
In this blog, Yannick Mitchell considers the gap between expectation and reality and outlines three practical steps local authorities can take to bridge it, turning hype into meaningful service improvement.
I recently attended an event hosted by Coram titled ‘From Insight to Impact: AI Intersectionality and the Future of Children’s Social Care’.
Among the excellent speakers and interesting discussions, there was one line in particular that resonated with me. Jabed Hussain, Associate Director of Business Efficiency and Digital Transformation at Kingston and Richmond, shared his view that local authorities (LAs) were “11/10 on excitement and 5/10 on readiness” when it comes to the adoption of AI.

The Local Government Association (LGA) report, ‘State of the Sector: AI Update’, supports this view. The report highlights that 48% of survey respondents thought their LA had the right policies and procedures in place, 29% thought their data (availability, quality and storage) was ready, and only 21% thought the workforce had the skills, knowledge or expertise to adopt or continue to adopt AI.
In an ideal world, readiness would precede adoption, but when I speak to LAs about AI, it’s clear that, whether or not the policies are in place, the data infrastructure is right, or the workforce is prepared, they are already procuring and using AI tools. The LGA report found a 10% increase in respondents saying they were using or exploring the use of AI compared with the previous year.
How is AI being used?
In my role as a Senior Consultant at Mutual Ventures, I’m fortunate to work with LAs across the country on the biggest transformations happening in children’s social care, and it never takes long before the conversation pivots towards AI and whichever tool has recently been procured.
LAs are already using AI in a number of ways, but in the LGA report, ‘staff productivity’ and ‘service efficiencies’ were the only two of the 10 categories in which realised benefits outweighed negligible ones. So why is AI not delivering more benefits, and what steps should LAs take to unlock its potential?
1) Start with the problems before looking for AI solutions
There is no AI tool that will solve every challenge a service is facing; an AI tool is only ever likely to be part of a solution. That’s why it’s important to start with the problem, not the tool. Magic Notes is a good example. If the problem is that social workers are spending too long typing up case notes and assessments, then a transcription tool can make the process more efficient. But it shouldn’t remove the step in which a social worker reviews the notes for accuracy and corrects any errors that will inevitably appear (Magic Notes evaluations have raised challenges when transcribing meetings with people who have speech difficulties). An evaluation in Kingston Council found that the time social workers spent on case notes and assessments was reduced by 50 to 60 per cent, and the tool is now in widespread use across the country.
But if the problem is the quality of case notes or assessments and not the time spent on them, then an AI-enabled transcription tool won’t be the answer. It might be part of the solution, but it’s not a replacement for upskilling staff through high-quality training.
The key is to take the time to define the problem, think about what the solutions might be and then consider the options. That might be AI, but it might not be.
2) Understand how staff are using AI and develop a strategy to support them
Just over two-thirds of respondents to the LGA survey identified “project scoping: understanding where AI can add value” as a barrier to procuring AI tools. I think this reflects the lack of strategies setting out how LAs intend to use AI and what outcomes they expect. How do you procure the right AI tool for your teams if you don’t have a clear idea of what you want to achieve?
I suspect there is also a significant gap between what an LA thinks its staff are using AI for and what is actually taking place. Without a strategy or policy guidance in place, there is a risk that staff use unlicensed, free versions of popular AI tools to produce work, which could have significant data protection implications. An AI strategy, alongside usage guidance, would help staff develop good practice and increase their confidence in using AI tools. Then we might expect to see some of the potential benefits of AI being realised.
3) Take an ongoing learning approach to using AI
Another message from the event that resonated with me was that discussion and evaluation of AI tools so often focus on the efficiencies they can deliver for practitioners. But what impact do AI-enabled services have on families and children? How does an interaction with a social worker change if a parent knows there’s a device in front of them recording every word they say? LAs should engage with their service users as well as their staff to understand how using AI affects their experience of receiving or delivering services. LAs need to understand whether the AI tools they have procured are achieving the desired outcomes and what, if any, unintended consequences might occur.
By focusing on real problems, supporting staff with clear strategies, and learning from both successes and challenges, local authorities can ensure that AI delivers genuine improvements for practitioners and the families they serve. Now is the time to move from hype to purposeful, evidence-led adoption.
To discuss the content of this blog post and how Mutual Ventures could support your organisation, contact Senior Consultant, Yannick Mitchell - yannick.mitchell@mutualventures.co.uk