What We Do

Ethical AI Training and Consultancy for Social Care

We train your staff, build your policies, and give you the tools to use AI safely and confidently. Every package is tailored to your organisation.

01

Frontline Practitioner Training

Your staff are already using AI. We teach them to do it safely. Scenario-based sessions covering ethical prompting, bias detection, and professional authorship. Practical from day one, not a lecture about what AI is.

Half-day or full-day · Counts towards CPD · PCF aligned · VERA:H framework
02

Leadership and Governance Training

Managers and board members are making AI decisions they don't yet have the language for. These sessions cover strategic risk, accountability structures, and how to oversee AI use confidently without needing to be a technical expert.

Half-day session · Board-ready · CQC and Ofsted aligned · Risk frameworks
03

Policy and Compliance Packages

Inspectors expect to see governance. We build it. AI policies, DPIAs, SOPs, and a forensic audit of how AI is actually used across your service right now. Plain English. Aligned to CQC, Ofsted, and UK GDPR.

AI policy · DPIA · SOPs · Forensic audit · UK GDPR
04

Safe-Start Prompt Libraries

Staff shouldn't need to learn prompt engineering to use AI safely. We build ready-to-use prompt libraries for your service area: adults, children's, fostering, adoption, or residential care. Legally reviewed and usable from day one.

Service-specific · Legislatively aligned · Legally reviewed · Ready to deploy

We work with budgets from small voluntary organisations to large local authorities.

Every package is scoped to what you need. Start with a conversation.

Book a Call

"We translate AI into practice, so you can make informed decisions about AI in your service."

How It Works

From first conversation to delivery in three steps

No two organisations are the same. That is why every package starts with a conversation about what you actually need.

1

Book a call

A free conversation about where your organisation is with AI and what you actually need.

2

We scope the work

We put together a tailored package. You see exactly what you are getting before you commit.

3

Delivery and support

On-site or remote delivery with all materials included. Follow-up support so your team is not left on their own.

Built for Social Care

Who we work with

We work exclusively in social care. That focus means we understand the regulation, the workforce pressures, and the language your teams actually use.

Local Authorities

Adults and children's services, ASYE programmes, and commissioning leads.

Independent Providers

Fostering agencies, residential care, and supported living services preparing for CQC or Ofsted.

Voluntary Sector

Charities and third-sector organisations supporting vulnerable people who want to use AI responsibly.

Common Questions

AI consultancy and training: what to expect

How long does AI training for social workers take?

Frontline practitioner training runs as a half-day or full-day session depending on your team's needs. Leadership sessions are typically half a day. All sessions are interactive and built around real scenarios, not slides and lectures.

Does the training count towards CPD?

Yes. All TESSA Tools training counts towards your CPD hours and is aligned to the Professional Capabilities Framework. Complete your CPD on your training day instead of scrambling to fit it in before your Social Work England deadline.

Do you work with councils, private providers, or charities?

All three. TESSA Tools works with local authorities, independent fostering agencies, residential providers, and voluntary sector organisations across England and the UK. Every package is tailored to the organisation's size, structure, and regulatory context.

What is included in a policy and compliance package?

A typical package includes an organisational AI policy, data protection impact assessment, standard operating procedures for AI use, and a forensic audit of how AI is currently being used across the organisation. Everything is aligned to CQC or Ofsted requirements and UK GDPR.

Can you build custom AI prompts for our service?

Yes. Safe-Start Prompt Libraries are built specifically for your service area, whether that is adults, children's, fostering, adoption, or residential care. Each prompt is legally checked, aligned to relevant legislation for your sector, and designed so staff can use it from day one without needing to learn prompt engineering.

From Practitioners

What practitioners say about Nadia's approach

“TESSA Tools brings a level of structure and ethical reflection that fits perfectly with AMHP practice. It helps me check reasoning, tone, and language when writing complex assessments without ever removing professional judgement.”

Approved Mental Health Professional (AMHP)

“TESSA Tools helped me build confidence in writing assessments that are clear, person-centred, and legally sound. The prompting structure provides a reliable framework to quality assure assessments. This has saved me time to focus on other things like supervision.”

Team Manager

“Nadia explains AI in a practitioner-friendly way that enables me to apply my own voice and maintain a human tone. Her approach to ethical practice is evident in how she links professional reasoning to prompt design.”

Practice Educator

“My eyes have been opened to the vast possibilities. From streamlining workflows to enhancing decision-making, I now feel confident and empowered to integrate AI into my work. A fantastic resource and a great teacher.”

Social Worker
Case Study

From zero governance to statutory compliance. In under 12 weeks.

How a Midlands fostering agency with 153 staff integrated Microsoft Copilot safely, with bespoke governance, sector-specific training, and measurable results.

22 min average time saved per AI-assisted case note
71% of staff actively using Copilot safely within 4 weeks
87% of managers confident in supervising AI-assisted work

“This was an eye opener about how AI works. It really helped me get to grips with the complexity of these systems and how they interact with practitioners. I think they made the right move, training and setting up the governance before handing out the tool.”

Sara Davis, Director, Future Families

Let's talk about what your organisation needs

Book a call and we will listen first, then tell you honestly what we think would help.

Start with a conversation


Book a Call