Webinar on the evolving use of AI in social work and social care: recording, slides and write-up from a session with Digital Care Hub, Skills for Care, BASW and Social Work England.
Speakers:
- Katie Thorn, Director of Innovation, Digital Care Hub
- Sarah Gilbert, Head of Adult Social Care Workforce Strategy, Skills for Care
- Natalie Day, Assistant Director – Policy and Strategy, Social Work England
- Andrew Reece, Strategic Lead for Wales and England, BASW
Download a PDF of the presentation
Scroll down to read a summary and access useful links.
View a recording below
Summary
AI is already being used
AI isn’t something for the future – it’s already here. Across the sector, people are using it for:
- Writing case notes
- Transcription and recording visits
- Drafting reports
- Supporting admin tasks
Many people are seeing real benefits:
- Less time on admin
- More time with people
- Better workflow and efficiency
Research shared in the session included examples of AI reducing case note time for social workers by 30–50%.
But it’s not always being used with guidance
One of the key challenges is consistency. Some people are using AI with clear direction from their organisation, while others are using it informally, without guidance. We also heard that most newly qualified social workers haven't had any training on AI. This means people are often figuring it out as they go.
This is a workforce issue, not just a tech issue
This was a big theme throughout the session. AI is changing how work gets done, what skills are needed, and how services are delivered. So this isn't just about technology. It's about how we support the workforce to adapt confidently and safely.
Clear agreement on boundaries
There was strong alignment across all speakers on key points:
- AI can support practice, but not replace it
- Professional judgement must stay with people
- Social workers remain responsible for their work
- AI should not be used to make important decisions about care
There are real risks to manage
Alongside the benefits, we also discussed important risks:
- Data protection and confidentiality
- Bias in AI systems
- Inaccurate outputs (“hallucinations”)
- Over-reliance on AI
There were also concerns about losing reflective practice where AI has replaced the act of writing. Speakers argued that writing isn't just admin; it's part of how practitioners think and make sense of complex situations.
Responsible AI matters
A key focus for Digital Care Hub is the responsible use of AI. Through our work with The Institute for Ethics at the University of Oxford and Casson Consulting, we're looking at what responsible use of AI looks like in practice.
At its core, this means making sure AI:
- Supports people, not replaces them
- Protects dignity, choice and independence
- Is shaped through co-production with people who draw on care and support
We announced in March 2026 that we will be hosting an alliance on AI and social care, focused on bringing stakeholders together to push for responsible use.
Useful links
- Adult Social Care Workforce Data
- Workforce Strategy
- AI and the Future of Social Work: A One-Day Summit (Event Page)
- BASW NOPT Annual Conference 2026 (Eventbrite Tickets)
- Ada Lovelace Institute Report: Scribe and Prejudice
- Nesta Research Report: AI SRAL MagicNotes
- Social Work England: The Emerging Use of Artificial Intelligence (AI) in Social Work
- OpenLearn AI Hub (OU’s Free Learning Platform)
- BASW AI Guidance for Social Work
- Oxford Ethics AI
- Digi Leaders: Bias & Ethics Topic
- CoPilot Vulnerabilities Whitepaper (Google Docs)
- YouTube: The Big AI Server for UK
- Think Local Act Personal: Principles and Priorities for Responsible Use of Generative AI in Care and Support
- BASW Practice Educator Professional Standards (PEPS)
- Council of Deans: Principles for the Use of Generative AI in Healthcare Education
- AI in Social Care Alliance
- Oxford Project – including Call to Action and Tech suppliers pledge