Casework audit for extra-familial harm: are we auditing what matters?
By Rachel Ringham
N.B. In this piece, we use “audit” to mean case-focused quality assurance and learning activity across children and young people’s services, adults’ services and partners, not a single tool or a single team’s paperwork.
Casework audit is often treated as a quality assurance staple in children’s social care. Most of us can recite the reasons for it: assurance, compliance with statutory guidance, consistency, oversight, and so on.
But there is a harder truth behind audit: under pressure, systems reach for what is most legible and defensible. In high-scrutiny periods, when capacity is thin, we look for what we can evidence, we tighten processes, we standardise tools, and we score what we can see. Audit can drift into reassurance, recording activities that look solid on paper, even when practice is not yet shifting in ways that reduce harm, create safety and nurture agency and welfare with children and young people.
For extra-familial harm, drift matters. And it matters for young adults too, when needs and harms continue but systems are not designed to keep responding with anything like continuity. When we talk here about children and young people harmed outside the home, we mean realities faced by many young people from their early teens into their mid-twenties. This includes exploitation, serious violence, peer-on-peer harm, coercion, online abuse, and the harms that sit alongside housing insecurity, criminalisation, mental health needs and disrupted family relationships.
Transitional Safeguarding keeps our attention on the continuity of needs, harms and responses, rather than treating eighteenth birthdays as the cut-off point of our responsibilities.
In practice, that means audit and learning need to travel across children and young people’s services, adults’ services and key partners, including education, policing, health, housing, youth and community settings, rather than staying pinned to a single case file.
This sits alongside questions we have been exploring in the Resonant Collaboration blog:
In our article on strengths-based practice and extra-familial harms, we looked at how well-intended approaches can, at the wrong moment, subtly shift responsibility onto young people and away from perpetrators, services and institutions.
In our aftercare article, we explored how anxiety and scrutiny can lead systems to tighten processes and reach for certainty. That is understandable, but it can make deeper learning harder to achieve. Audit sits right in the middle of all of this.
The uncomfortable question stays simple: does our audit tell us that practice is compliant, or that young people are safer? By “safer”, we mean reduced exposure to harm and increased day-to-day safety, agency and welfare as the young person experiences it, not a promise of zero harm.
Why audit matters here
At best, audit helps us understand how decisions are being made, what practitioners are noticing, what they are missing, and the conditions shaping practice. For extra-familial harm and Transitional Safeguarding, that means audit needs to stay close to the young person’s lived world, and close to how the partnership is holding risk around them. It also means looking for continuity: what stays steady after young people reach age 18, and where we are letting young people fall into gaps.
A useful audit will usually be able to answer six grounded questions, as prompts for inquiry rather than scoring items:
Are we recognising extra-familial harm early enough, including for young adults whose needs can become less visible across service boundaries?
Do we understand the young person’s context beyond the family home: routines, relationships, locations, online life, and the pull of peers?
Are partners who hold key knowledge and influence actively involved and coordinated (schools and colleges, police, youth services, health, housing, community organisations and others)?
Is there a safety plan that lives, meaning practical shifts in daily safety rather than recorded actions alone?
Can we see impact: reduced exposure to harm, and evidence that practice is reducing harm, creating safety and nurturing agency and welfare as the young person experiences it?
Is there grip and continuity through thresholds and handovers (step-up/step-down, discharge, move to adults’ services, custody, housing moves), rather than a transfer that looks tidy but leaves risk untouched?
A much-needed note on evidence
The “impact” question is the one most audit tools struggle with, because safety is not fully visible in the record. If we want audit to stay honest, we need to name what it can and cannot do. A fit-for-purpose audit should look for credible proxies of changed exposure and changed day-to-day safety, and it should expect triangulation. In other words, we should not pretend a case file can “prove” safety, but we can still test whether there is enough evidence to believe things are shifting.
Examples of auditable proxies include:
patterns in incidents and intelligence (frequency, severity, location, triggers) and whether these patterns change over time
changes in routines, movements and associations, including safer routes, safer times, safer people, and safer online practices
evidence of effective disruption and protective action across agencies (not only referrals and meetings)
stability indicators that often track safety (housing stability, educational engagement, reduced missing episodes)
what the young person reports over time, when it is safe for them to do so, and how practice adapts when disclosure is partial or risky
whether the plan reduces reliance on “engagement” and increases system-side adaptation (safe spaces, adult presence, boundary work, environmental change)
And where the file cannot carry the weight, audit should point to adjacent methods, such as short case conversations, multi-agency debriefs, or structured feedback gathered safely, so lived experience is not reduced to a checkbox.
The limits of traditional audit frameworks
Many audit tools were built for situations where risk is assumed to sit primarily within the family home. These tools tend to prioritise parenting capacity, home conditions, statutory visiting patterns, completion of assessments and reviews within timescales, and the presence of documents, meetings and recorded decisions.
These measures still have value and create a floor for safe practice, but when it comes to extra-familial harm and safeguarding young adults, they can be blunt instruments. They can tell us that procedures were followed, while leaving unanswered whether unsafe contexts were understood, disrupted, or made safer; whether multi-agency work was effective; and whether the young person’s day-to-day experience changed in ways that reduce harm, create safety and nurture agency and welfare.
This is also where audit can unintentionally reinforce a narrow safeguarding lens. If audit mainly scores what is legible in family-focused documentation, it can pull practice back toward family explanations and away from contextual analysis. The file looks complete while harm continues beyond professional sight.
For young adults, the parallel risk looks different. The “file” may not exist in the same way, or it may sit in another part of the system. If our audit approach relies on a single record to carry both responsibility and learning, we can miss what is happening in the lived system response.
Auditing for both safety and assurance
There is a difference between work that is easy to evidence and work that improves safety. Both matter. The problem comes when one crowds out the other. Evidence tends to cluster around what is countable: forms completed, visits recorded, meetings held, risk assessments updated, and timescales met.
Safety from extra-familial harm is rarely achieved by a single action or service. It is built through relational and contextual shifts across the partnership. That is where we see practice that reduces harm, creates safety and nurtures agency and welfare: clearer understanding of routines and peer dynamics, sharper analysis of coercion and loyalty, more effective disruption, stronger multi-agency coordination, action to create safety within environments, and continuity of support through life-transition points.
A fit-for-purpose audit makes room for both, treating compliance as necessary and refusing to let compliance become the whole story. We ask: “Were actions taken?” And then we ask: “What difference did those actions make, and what evidence would we reasonably expect to see if practice was shifting in ways that reduce harm, create safety and nurture agency and welfare?”
Keeping lived experience central
Most audit frameworks state that the young person’s voice is central. In practice, this can collapse into procedural checks: “Was the young person seen alone? Were views recorded?” For young people harmed outside the home, this is rarely enough. Many are negotiating fear, coercion, loyalty, shame, survival strategies, and the everyday pressures of peers and place. What they can say, and when they can say it, is shaped by those realities.
A stronger audit interrogates whether professionals in the system have understood lived realities well enough to make good decisions. A stronger audit asks whether practice is anchored in what the young person experiences as risk and as safety, and whether planning is likely to reduce harm, create safety and nurture agency and welfare. It makes space for questions like:
How does the young person describe what is happening, and what do they avoid naming?
Where do they feel safe, and where do they feel at risk, across people, places and online spaces?
How are routines, movements and relationships being understood and worked with?
Where does planning rely on the young person “complying” or “engaging”, and where is the system adapting to the realities of coercion, trauma and survival?
This is the difference between a young person being “heard in the record” and a young person being understood in decision-making.
Multi-agency practice and cross-boundary grip
Extra-familial harm rarely sits neatly with one service. Creating safety depends on community organisations, education settings, health, housing, police, youth work, and local place-based responses. Transitional Safeguarding adds another layer: children’s and adults’ systems may both be involved, yet often neither considers that it holds clear responsibility for a young adult.
Audit still often treats a single organisational record as the primary container for practice, responsibility and learning. In responding to extra-familial harm and safeguarding young adults, this can create predictable gaps. Multi-agency drift can go unchallenged, and disruption activity may not be visible or valued. Contextual interventions can sit outside the audited record, and responsibility can quietly slide back onto families, or onto the young person, even where external harm is the driving force. Handovers at age 18 can become administrative transfers rather than continuity of relationship and safety planning that reduces harm, creates safety and nurtures agency and welfare.
Change the unit of audit: from “file” to “safety episode”
If we want audit to travel across boundaries, we need a unit of analysis that can travel as well. One practical shift is to audit a safety episode or pathway, anchored on the young person and the period in which harm is escalating, being responded to, and, ideally, being reduced. This kind of audit draws from multiple records and perspectives rather than depending on one file to hold the truth.
A minimum “audit pack” for a safety episode might include:
a one-page timeline of key events, transitions and decision points across agencies
a short contextual map (people, places, online spaces, routines) and what changed over time
a summary of the partnership plan, including disruption and environment actions, not just meetings and referrals
agreed indicators and proxies of shifting exposure and safety for this young person, and the evidence available
a transition check (thresholds, handovers, housing moves, custody, adult service involvement) showing what held steady and what dropped away
a short reflective note on hypotheses tested, what was learned, and what the partnership would do differently next time
This makes cross-boundary practice visible enough to learn from, without claiming that paperwork alone equals safety.
What needs to change in audit practice: two jobs, two tracks
If audit is to stay relevant and impactful, it may need more than minor adjustments. Some of what needs to change is the tool (what we look for). Some is the process (how audit is held, who is involved, and what happens to learning afterwards).
A key contradiction sits underneath many audit programmes: we want audit to provide assurance under scrutiny, and we also want it to create learning without fear. Those are different jobs, and when we pretend they are the same, audit is pulled toward defensible compliance.
One way to resolve this is to run audit on two linked tracks:
Assurance track (the floor): a small set of non-negotiables that confirm basic safety practice and legal duties are met, kept tight and proportionate.
Learning track (the growth): a reflective, partnership-facing review focused on contextual understanding, effectiveness of coordination and disruption, continuity across transitions, and credible evidence of shifting exposure and safety.
For the learning track to work, it needs protection in how it is governed. That means clarity of purpose, separation from performance management, skilled facilitation, and a visible feedback loop so learning leads to changes in practice conditions, not just new tasks.
This is also where continuity with aftercare matters. In our aftercare article, we wrote about how feelings move through systems, shaping what can be said and what kind of learning becomes possible, and how anxiety can drive defensiveness and process-tightening. The same dynamics can show up after a hard audit: if we want honest learning, we need planned support for sense-making afterwards, not just an action plan and silence.
In practical terms, that can mean:
tools that examine contexts, networks and transition points, not only family functioning
attention to professional curiosity and analysis, not just activity: what did practitioners notice, what hypotheses did they test, what are they worried about and why
a focus on continuity, asking what held steady for the young person across moves, thresholds, handovers and changes in legal status
a focus on impact that is evidence-aware, using agreed proxies and triangulation rather than relying on recorded activity alone
multi-agency accountability built into audit, because no single agency can hold the whole picture or deliver safety alone
reflective audit spaces, where audit is collaborative and learning-oriented, rather than deficit-driven scoring that triggers defensive practice
planned “aftercare” for audit outcomes, so teams have space to process what the audit surfaced, integrate learning, and stay open rather than defensive
Audit, in other words, becomes part of the learning infrastructure: a way of keeping real lives in view, keeping partnership practice honest, and strengthening the conditions where practitioners can think and act well together.
A simple test to keep us honest
When you read an audit outcome, ask yourself:
Does this mainly confirm what can be evidenced, or does it illuminate what shifted in safety, and what evidence supports that claim?
Does this strengthen live practice with young people, or does it mainly confirm that we can account for ourselves?
Is learning travelling across the partnership and across transitional ages and stages, or stopping at organisational and statutory boundaries?
If an audit can help us answer those questions well, it becomes more than assurance. It becomes a mechanism for learning that is fit for the realities of harm outside the home, including through the years when services most often fragment.