From Survey to “National Evidence”: How a CMS Statistic Was Amplified in Parliament
*Survey of 1,622 separated parents*
There is an important question at the heart of the current debate on the Child Maintenance Service (CMS):
How does a limited survey become “national evidence” in Parliament?
The answer matters — because it goes directly to how policy is being shaped.
The Origin: A Limited Survey
The widely cited figure comes from research by Gingerbread.
Their report was based on:
- 1,622 survey responses
- A self-selecting sample
- Focused specifically on parents with care (receiving parents)
Within that group, the report states:
77% of parents with care using the CMS reported experiencing domestic abuse.
This is a specific finding about a defined group of respondents.
The First Shift: Broader Public Framing
On its website, Gingerbread presents the same statistic as:
“77% of parents using the CMS had experienced domestic abuse from the other parent.”
This wording is materially broader.
It removes key context:
- The survey nature of the data
- The sample size (1,622)
- The restriction to parents with care
The result is a statement that reads as though it applies to CMS users generally.
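As a purely illustrative aside: even if one granted the most generous assumption that the 1,622 responses were a simple random sample of CMS parents with care (which a self-selecting survey is not), the 77% figure would still carry sampling uncertainty, and the resulting interval would describe only that subgroup. A minimal sketch of the best-case calculation, using only the numbers already cited:

```python
import math

# Reported figure: 77% of 1,622 self-selected respondents
p, n = 0.77, 1622

# 95% confidence interval for a proportion. This formula is valid
# ONLY under simple random sampling -- an assumption a self-selecting
# survey does not satisfy -- so this is a best-case bound, not a
# claim about all CMS users.
se = math.sqrt(p * (1 - p) / n)
lo, hi = p - 1.96 * se, p + 1.96 * se
print(f"95% CI (best case): {lo:.1%} to {hi:.1%}")
```

Even this best-case interval says nothing about the wider CMS population; self-selection can shift the figure far outside any such bound, in either direction.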
The Final Step: Parliament
In the Westminster Hall debate on 17 March 2026 tabled by Kirith Entwistle, the statistic was then presented as:
“The national evidence is deeply concerning. Research by Gingerbread… found that 77% of primary carers using the CMS reported experiencing domestic abuse…”
At this stage, the statistic is no longer:
- A survey finding
- A subset analysis
It has become:
“national evidence”
Read the Hansard Transcript (official record)
Watch the debate on Parliament TV
The Escalation Chain
This progression is clear:
- Survey data (limited, contextualised)
- → Broader public wording (context reduced)
- → Parliamentary statement (system-wide implication)
At each stage, the framing becomes wider.
Why This Matters
This is not merely a technical issue.
It has direct consequences for:
Policy
Statistics framed as “national evidence” influence:
- Legislative reform
- Enforcement policy
- Safeguarding decisions
Public Understanding
It creates the impression that:
The majority of CMS cases involve domestic abuse
That may or may not be true — but this statistic, in its original form, cannot establish that.
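The mechanism behind this limitation is self-selection: people directly affected by an issue are typically more motivated to respond to a survey about it, so the survey share can greatly exceed the population rate. A small simulation makes the point; every number in it is invented purely to demonstrate the effect, not to estimate the real prevalence among CMS parents.

```python
import random

random.seed(0)

# Hypothetical illustration only: these rates are invented to show
# how self-selection inflates an observed survey percentage.
TRUE_PREVALENCE = 0.40   # assumed population rate (invented)
P_RESPOND_AFFECTED = 0.30  # assumed response rate if affected (invented)
P_RESPOND_OTHER = 0.10     # assumed response rate if not (invented)

responses = []
for _ in range(100_000):
    affected = random.random() < TRUE_PREVALENCE
    p_respond = P_RESPOND_AFFECTED if affected else P_RESPOND_OTHER
    if random.random() < p_respond:          # only responders are counted
        responses.append(affected)

observed = sum(responses) / len(responses)
print(f"true rate: {TRUE_PREVALENCE:.0%}, observed in survey: {observed:.0%}")
```

With these invented rates, the survey would report roughly two-thirds affected even though the assumed true rate is 40%. The same arithmetic cuts both ways: without knowing who chose to respond, a self-selected percentage cannot be read as a population figure.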
Balance of Evidence
At the same time, there is far less prominence given to evidence showing harm elsewhere in the system, including:
- Parliamentary accounts of suicide linked to CMS processes
- Freedom of Information data on deaths within the system
- Evidence of distress and elevated mortality among paying parents
The Core Issue
The issue is not that the statistic exists.
The issue is how it has been presented.
A survey of 1,622 respondents has been progressively reframed as applying to all CMS users and ultimately described in Parliament as “national evidence”.
That is not a minor shift.
It is a material change in meaning.
A Simple Principle
If policy is to be effective, it must be based on:
- Accurate data
- Proper context
- Balanced evidence
When context is lost, even accurate statistics can lead to misleading conclusions.
Final Point
This is not about dismissing the experiences of those who took part in the survey.
Their experiences matter.
But so does how those experiences are represented.
Because when limited data is presented as representative of the whole system, the risk is not just misunderstanding — it is policy built on an incomplete picture.
Should This Be Clarified?
It is important to be clear about responsibility.
Members of Parliament are responsible for the accuracy of statements made in Parliament. However, where research produced by external organisations is relied upon in Parliamentary debate, those organisations are not neutral bystanders.
The statistic cited in the Westminster Hall debate was presented as “national evidence”. Yet, as set out above, the underlying data derives from a self-selecting survey of approximately 1,622 respondents, specifically parents with care, and cannot be treated as representative of the CMS population as a whole.
In these circumstances, while Gingerbread is not responsible for how Members of Parliament choose to frame statistics, it is aware that its research is being used in public and Parliamentary discourse.
That gives rise to a reasonable expectation:
Where research is presented beyond its methodological scope, clarification becomes necessary to avoid misunderstanding.
This is particularly important where:
- The statistic is being used to support policy arguments
- It is described as “national evidence”
- It may influence legislative or enforcement decisions
A simple clarification of scope would not diminish the experiences reflected in the survey.
It would ensure that:
Those experiences are understood in their proper context — rather than as a proxy for the system as a whole.
In the absence of such clarification, a legitimate question arises:
Is the distinction between survey findings and system-wide evidence being sufficiently maintained in public debate?
Where research moves from a limited survey to “national evidence” in Parliament, the issue is no longer just interpretation — it becomes a question of whether the evidential foundation for policy is being accurately represented.
Gingerbread Fix the CMS report: https://www.gingerbread.org.uk/wp-content/uploads/2024/11/Gingerbread-Fix-The-CMS-Report-WEB.pdf
Gingerbread’s post about the debate: https://www.gingerbread.org.uk/our-work/news-and-views/gingerbreads-recent-work-on-the-cms/?
